Ryzen 7 9800X3D, Really Faster For Real-World 4K Gaming?
Posted by SmashStrider@reddit | hardware | View on Reddit | 390 comments
evangelism2@reddit
Can't speak for 4K, but at 1440p in World of Warcraft, a notoriously CPU-bound game, the lift from my 5800X has been massive: 40-50 fps in main cities to 100-120.
Synolol@reddit
What about raids?
evangelism2@reddit
Sadly I stopped raiding a week or 2 before I got my chip (cleared heroic). But Mythic+ saw a similar lift.
_OVERHATE_@reddit
This video is great to finally have something to link when absolute brainless drones keep asking "where's the 4K benchmarks hurr durr" in every fucking review.
Also great to see just how INSANELY good the 5800X3D is and how, if you have one of those, you are still in "not enough of a leap to upgrade" territory for most cases.
I'm on a 7700K (lol) and my 9800X3D arrives on Wednesday, I can't be more hyped.
mapletune@reddit
No amount of explanation by HUB, or Daniel Owen (A LEGIT TEACHER), or GN, or anyone will be enough to dispel this notion online. They just go straight to the comment section without even watching the video or attempting to understand what's being taught.
It's like actual conspiracies/flat earth/cults or something: the information falls on deaf ears, they turn off their brains, and they continue believing whatever they believe. It's amazing, entertaining, unfortunate, and sad at the same time.
GOOGAMZNGPT4@reddit
The consumers know what is happening, it's the reviewers who are not getting it.
No one cares that $2000 GPUs are being paired with $500 CPUs to play games at 1080p low quality in a contrived test that is ignorant of real-world usage.
There is no reason not to test at all resolutions.
VampyrByte@reddit
They should, as it's a good indication of the comparative performance of the benchmarked product. They test like this to remove, as far as reasonably possible, other variables from the test and focus purely on the part in question. This means that you as a viewer can compare part A to part B.
Having "real world tests" would just make it more difficult to actually compare the parts. You wouldn't test CPUs in a GPU-constrained scenario for the same reason you wouldn't test GPUs in a CPU-constrained scenario. The results would be largely nonsense, with the supposedly tested part starved of work while some other part of the system is the constraint the majority of the time.
No one games surrounded by soundproofing, but there wouldn't be any point in testing the noise of a case and its fans by "testing" it with music on, a dog barking and kids screaming in the background. You wouldn't test coolers with the PC next to a fireplace.
Of course there is: the result is meaningless noise that wastes the reviewer's time making it and the viewer's time watching or reading it. There are more combinations of computer hardware and games than there are computers to play the games on.
GOOGAMZNGPT4@reddit
Wasted breath.
We understand the reasoning; it's the wrong way to test.
No one is spending $2500 on enthusiast level parts to play games in a contrived test scenario.
It's like testing a scuba set in the open air on the beach instead of underwater. It's an unnatural test condition that won't test for the intended use of the product.
VampyrByte@reddit
Clearly not. Nothing is understood properly by anyone who thinks like this. If you want full-system tests, watch or read those. A CPU test is, by definition, not a full-system test, and people conducting one with any amount of integrity will be doing so while trying to isolate as many variables as possible.
No one buys anything, for any amount of money, to use in a "contrived" test scenario. We don't run static images on TVs for weeks straight to test for image retention, we don't listen to nothing but Dolby Atmos test sequences on our soundbars or drive our cars exclusively on disused runways from 0-60. The point of these tests is to differentiate the performance of competing products and collect reliable data points. If you want to know how well X CPU will work with Y GPU, then you can look at the benchmarks for both products and make an informed decision.
Would you want your scuba gear reviewed in the sea, or in a controlled environment that puts lots of different products under as near as possible the same conditions? "No one is scuba diving in your temperature- and pressure-controlled validated test tank," you might say, and you'd be right. But if you aren't comparing the products under the same conditions, you can't be sure you are making a valid comparison at all.
Reviewing TVs six inches from the ceiling, in direct sunlight, watching upscaled SD content isn't how it's done.
Reviewing soundbars playing stereo audio because no one can be arsed to configure Dolby Atmos isn't sensible.
Reviewing high-performance sports cars in rush-hour traffic is daft.
Reviewing CPUs in GPU-constrained scenarios is silly. Even if that is how gamers use them, and even should use them, it doesn't allow you to properly understand the situation; it's just noise.
Comprehensive_Rise32@reddit
Is it though? Because that tells us whether we already have the best CPU for that GPU at that resolution and settings. No one wants to waste money buying a high-performance sports car that's advertised as 50% faster when it can't go any faster in rush-hour traffic.
And yes, I do accept the argument that the full capabilities will be unlocked in the future with a new, faster graphics card, but in the meantime LTT showed the 9800X3D is not much faster than the 5800X3D at 4K Ultra, and that's important to see, as one can cross the "fastest" CPU off the shopping list to save some cash.
VampyrByte@reddit
If you test a car's top speed, for example, in traffic, you will find you haven't tested the car, but instead the traffic. The same thing effectively happens with CPUs: in GPU-constrained scenarios, which most games are, it ends up being the GPU that is actually tested.
The reality is that in the overwhelming majority of games, performance is best measured in terms of how it is displayed, and that means framerates and frametimes. But changing the resolution, or even most graphics settings, greatly affects the work the GPU has to do for a frame, while having a very minor effect on the CPU workload. In an ideal world we would have a GPU that can return a frame instantly, but we don't, so we have to settle for reducing GPU load to the point that the GPU is hopefully waiting for the CPU as much as possible. Reviewers tend to do this by using the fastest GPU they can get their hands on, combined with the lowest settings.
If you want to get to work quicker, then reading reviews of supercars looking at top speeds is clearly a waste of time. You might save a bit of time in a faster-accelerating car when joining a motorway, but overall, due to speed limits and traffic, most of us aren't getting to work any quicker in a Ferrari than we would in a Toyota. However, if you were buying a car to go racing in, it would be incredibly frustrating if all the "data" were "commute time" measurements.
Strazdas1@reddit
It's not. That's the issue you and those reviewers are not getting. Draw-call forwarding at 720p lowest settings is in no way indicative of real-world CPU performance.
AnimalShithouse@reddit
In most gaming these days, all new CPUs are basically good enough. The comparative testing for ranking is fine. The misleading part is that consumers might look at the rankings and pick the fastest because they think they'll need it, when the reality is most people would be fine with a 5800X3D, even in 2024. Probably in 2026 too.
They'll be GPU-limited for years to come at 4K, and at 1080p the 5800X3D is already fantastic.
It is a ranking of which CPU is "best", with the omission that "they're all going to play your shit fine unless you play some hard niche game, at which point you probably care more about that specific benchmark than about how many extra FPS this CPU gets in CS2."
OGigachaod@reddit
False.
INITMalcanis@reddit
>There is no reason not to test at all resolutions.
Other than that it's a huge amount of work. Are you volunteering your time here...?
UglyFrustratedppl@reddit
The only thing that would dispel this myth is if they had to make CPU reviews themselves. Eventually they would figure out that doing native 4K CPU reviews is a waste of time. Once you've seen one you've seen them all.
Fun_Age1442@reddit
I'm pretty sure Daniel Owen is a maths teacher, not a professor or a teacher of computer science or anything.
mapletune@reddit
And these commenters are computer science industry experts? lmao~
In case you really did miss the point, we are talking about patience, empathy, the ability to break down complex topics into simpler parts, presentation skills, etc., all of which good teachers are better equipped with than the average Joe.
Good YouTubers/entertainers/lecturers/coaches/trainers do too, but at least in my culture we value and respect teachers a lot, hence my own emphasis.
Strazdas1@reddit
I think you completely missed the discussion if you think this is what we are talking about.
Fun_Age1442@reddit
I'm not undermining him or saying that I or any redditor is better, just saying someone like GN probably has more knowledge than, or the same knowledge as, Daniel Owen; however, him being a teacher means his delivery is easier to understand.
hak8or@reddit
I see your English is mostly fine, so I assume English isn't your distant second language (if it is, please ignore the remainder), but it's genuinely humorous to see someone arguing the efficacy of a teaching method, miss the point, and make such an elementary mistake expressing it.
sabrathos@reddit
That's literally what they said, though: "no amount of explanation from [...] A LEGIT TEACHER [...] will be enough". Sure, Steve is extremely well versed in the details of hardware performance and testing methodology, but as someone who watches most of his videos, I'd say he bounces between being an effective tech communicator and losing the plot while trying to figure out which technical details to expand on and when (and how to slip in his sardonic flair).
Fun_Age1442@reddit
My bad then
Strazdas1@reddit
Because HUB, Daniel, and GN are wrong on this.
hak8or@reddit
It's genuinely a bottleneck-and-numbers game, which a lot of folks sadly have immense difficulty with.
If people were able to handle those two, industries like Rent-A-Center would fold immediately. Hell, you can do benchmarks like this at home yourself: disable all but one CPU core in the BIOS, run a game benchmark at 4K capturing average frame times and 99th-percentile frame times, then enable another core, rinse and repeat, throw your values into Excel or any online line chart, and see where it flattens out.
While yes, this is absolutely an imperfect benchmark, it's very much more than enough to get a rough idea of where you are in terms of CPU needs vs GPU needs.
Once that flattening out happens "too close" to the point where you enable all cores, it's time to start looking at new processors. If you never see it flatten out at all, you are compute-bound.
If most people were able to do an exercise like above, the USA or the world would be a wildly different one than what we have today. "Vibesession" would fall back into the laughing stock it was many years ago.
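A minimal sketch of that core-scaling exercise in Python; the fps figures below are made-up placeholders, not measured results, so substitute your own numbers from the in-game benchmark at each enabled core count:

```python
# Core-scaling exercise sketched above: enable more cores, re-run the benchmark,
# and look for the point where extra cores stop helping. All fps values are
# hypothetical placeholders for illustration.
core_counts = [1, 2, 4, 6, 8]
avg_fps     = [38, 71, 112, 118, 119]   # replace with your own measurements

for cores, prev, cur in zip(core_counts[1:], avg_fps, avg_fps[1:]):
    gain = (cur - prev) / prev * 100
    print(f"{cores} cores: {cur} fps ({gain:+.1f}% vs previous step)")
    if gain < 3:
        print(f"  -> roughly flat here; likely GPU-bound beyond {cores} cores")
```

If the curve is still climbing at your full core count, the CPU is the limiter; if it flattens out well before that, the GPU is.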
COMPUTER1313@reddit
Saw one insist that the X3D CPUs "only excelled in certain games" due to "low clock rates", in a thread that discussed the 9800X3D.
And multiple folks argued Arrow Lake was still worth buying for productivity workloads, while overlooking that regular Zen 5 matches Arrow Lake's productivity performance at a lower CPU price, without the performance regressions in some workloads, with a cheaper RAM kit (DDR5-6000 vs CUDIMM DDR5-9000) and cheaper boards, and with an upgrade path to Zen 6 (while Arrow Lake's boards won't even get another CPU generation).
MiloIsTheBest@reddit
There were several comments shitting on Optimum for testing at 1080p when he tested everything at 1440p.
Able-Reference754@reddit
My point, on the other hand, is: wouldn't it be nice if there were numbers available on how much better a 9800X3D is compared to a 7700K at 4K? Surely you don't think there's no difference just because a reviewer said the CPU doesn't matter at 4K?
Because sure, it doesn't matter for the past few gens, but beyond that it does matter, yet any information or comparison on the subject is unavailable unless you trust crappy sites, since reviewers disregard the issue.
sabrathos@reddit
You have that review already! It's called GPU reviews.
For GPU reviews, tech reviewers specifically choose the highest performing CPU, so you see exactly how many frames a GPU can deliver for a game, assuming CPU cost was 0.
Now, if there's a review for the 7700k with one of the same games, even at 720p, you cross-reference it. These show how many frames a CPU can deliver for a game, assuming the GPU cost was 0. If the 7700k has fewer FPS than the GPU's review, you know that'll be what you'll get, +/-3%. No extra review required.
Remember, CPUs and GPUs pipeline work just like laundry washers and dryers. The slower one sets the throughput. If your washer takes 30min and your dryer takes 50min, your throughput of loads is one per 50min. In a world where "laundry reviewers" only knew how fast a washer was based on "throughput of loads", you'd benchmark by pairing each washer with the super-ultra-mega-dryer-max-XL that dries loads in 5min, so you know that whatever time the load takes is essentially guaranteed to be pinned to the washer's wash time.
That doesn't make the data not useful if you don't own a super-ultra-mega-dryer-max-XL. Just look at the review for your dryer model also, and compare the two and take the slower of the two values. Done.
(Now obviously the 7700k's so old that there's likely no overlap in games. So we're more-so talking something like a 3800X or something, or you can get a very rough estimate with some transitive calculations.)
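As a rough sketch of that cross-referencing in Python (the fps values are hypothetical placeholders, just to show the "take the slower of the two" idea):

```python
# Combine a low-res CPU review result with a GPU review result for the same game.
# Both figures below are hypothetical placeholders, not real review data.
cpu_limited_fps = 74    # what the CPU managed in a CPU-focused (low-res) review
gpu_limited_fps = 115   # what your GPU managed in a GPU review (paired with the fastest CPU)

# Like the washer/dryer pipeline: the slower stage sets the throughput.
expected_fps = min(cpu_limited_fps, gpu_limited_fps)
print(f"Expected in-game fps: ~{expected_fps} (give or take a few percent)")
```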
Skraelings@reddit
I never see my 3900X on the charts anymore; that gen has pretty clearly been passed by at this point. When something from that gen is on the chart it's usually a 3600/3700X and not mine. Granted, I could make some assumptions about where mine would sit, but knowing exactly would be nice.
sabrathos@reddit
I wasn't saying the 3900X would be on the new charts. I was saying games that the 3900X had been reviewed with can still potentially show up on the new charts. So you can still do the comparison side by side without having to go transitively between different games.
This is why I think Hardware Unboxed doing the 45-game deep dive reviews is actually quite useful, because you're fairly guaranteed to see some overlap with things like Shadow of the Tomb Raider that can act as decent anchor points for multiple generations back.
You'll still have to do some sanity-check scratchpadding to account for possible changes in test location (though I think HUB keeps that fairly consistent) and small variations from Windows and driver updates, but the result should give a good idea of how CPUs from generations back can stack up with current ones.
Skraelings@reddit
Plus AGESA updates too.
Yeah, I could correlate it that way, but I'm also a little lazy, so I'm hoping it gets included most of the time.
sabrathos@reddit
Well in that case just get a 9800X3D, it's guaranteed to be on pretty much every chart for at least 5+ years 😂 Like the 5800X3D before it.
Skraelings@reddit
Lmao facts.
_OVERHATE_@reddit
> wouldn't it be nice if there were numbers available on how much better a 9800X3D is compared to 7700k at 4K
Luckily for me I'm not tech literate, so I can easily do this wonderful thing called extrapolation. Look how much the 9th gen was over the 7th, then the 11th over the 9th, then the 13th over the 11th, and I get a pretty decent picture of how big the jump will be.
Charts don't lie: no GPU is capable of outpacing CPUs at 4K. At 4K, in 90% of cases, the GPU will be the problem, not the CPU. And that has remained true for the last 3 or 4 gens of CPUs.
Able-Reference754@reddit
Of course you can extrapolate for an estimation, but you can also have a reviewer that does their job and presents that data to you directly e.g. what techpowerup does.
notafakeaccounnt@reddit
On the other hand, if you don't have a 5800X3D-7800X3D but want to go AMD, this shows how long the 9800X3D can stay relevant with just GPU upgrades.
T-Baaller@reddit
I'd bet you could be happy with a 9800X3D until AM7 is coming out
Saying this as a 5800X3D enjoyer who will probably end up skipping AM5.
mynewhoustonaccount@reddit
Only reason I jumped from my 5800X3D was that my X570 motherboard started acting flaky after several years of reliable service. I figured why not jump now. My 5800X3D instantly sold for almost 300 bucks on eBay, so that subsidized the upgrade a bit.
CatsAndCapybaras@reddit
I'm on a 7800X3D. I will consider upgrading to Zen 6 X3D IF it is on the same platform. If not, I'll wait until Zen 7.
elessarjd@reddit
Wait. But the 4k testing with upscaling in real world scenarios is exactly what people are looking for. Isn't that the sort of extensive information we all want?
notam00se@reddit
No, no. Everyone should see the main 1080p review results and be able to extrapolate how it would perform at 4K, without a 4090, with your own setup.
Like, all of the titles were "Intel is dead", but this video shows that if you bought a 285K and play at 4K, it's bad value, but swapping to X3D might not make any difference in the games you play.
Braindead to buy a 285K just for gaming, but not the end of the world if you did.
R1ddl3@reddit
That was not the takeaway from this video at all though? The 4k results shown here couldn't be extrapolated from 1080p results.
notam00se@reddit
Yes, I was a bit /s.
Arguments against anything but 1080p CPU reviews say that 1080p results are enough even for first-time builders to know how a complete system will perform at 4K.
nanonan@reddit
I'd like to know both things, how it will perform down the road and how it performs in actual usage right now. This testing shows there are real world advantages at 4k right now in some titles, and the 1080p data doesn't tell you which titles those are. That's also important consumer information, and people aren't brainless for wanting it.
PM_your_Tigers@reddit
This is the first time I've been seriously tempted to replace my 9700K. I'm still on the fence: Factorio is where I'd see the biggest improvement, but even there I don't build big enough to push the 9700K that hard. For anything else I'd need to upgrade my GPU, and I'm not sure the generational leap from my 3060 Ti is going to be enough to justify it.
But this would give me headroom for years to come...
RedSceptile@reddit
Tbh, as a fellow 9700K gamer, even though I don't play Factorio (great game), I've started noticing enough stress in most of the titles I've played in the past year to make the decision, and I bought a 9800X3D on Friday. This video helped me feel more comfortable with the purchase (I'm doing a GPU upgrade as well in the spring) and kind of helped me think more long term as well.
TL;DR: I like my 9700K but realize maybe now is the time to move on, and the data has helped make the decision easier.
elessarjd@reddit
I'm in the same exact situation (9700k + 3060 TI), but this video was finally the evidence I've been looking for to upgrade my CPU in preparation for a new GPU. I know I'll either be getting a 4080 or 5080 in the coming months and this video has shown that even 3 years down the road having that headroom you talked about will be beneficial.
SirCrest_YT@reddit
Feels like HUB end up doing this for every launch.
OGigachaod@reddit
These aren't real 4k results, these are BS DLSS results.
_OVERHATE_@reddit
Oh hey another person who didnt watch the video.
Ok so tell me, oh genius of PC hardware, what do you think the 3090 section shows???? That 4K even with DLSS on will choke CPUs. The 4090 CAN'T do fucking 4K without DLSS at high framerates, it will choke ANY CPU thrown at it, and will show no difference. IT'S IN THE FUCKING VIDEO.
God dammit.
OGigachaod@reddit
That doesn't change what I said.
semitope@reddit
I mean... this doesn't really help your case. For you it matters, but you might still be cheating yourself out of more overall CPU performance by getting the 9800X3D. I don't know what's wrong with the reviewers' reasoning, or what their reasoning was, in using the 7700X and a broken 285K in this. A 14700K would have been a better comparison, since it's supposed to be significantly faster than the 9800X3D outside gaming. The 7700X is slower than the 9800X3D outside gaming, so it really had no chance in gaming to begin with.
_OVERHATE_@reddit
Nice try user benchmark
semitope@reddit
I didn't say 13600K. Why are none of you questioning their choice of parts? We know the 285K is broken and the 7700X is just a worse CPU (though almost half the price).
natsak491@reddit
Swapped my 5900X out for one over a year ago now, it feels like, and it has been great. 32 GB of DDR4-3600 CL14, I think, and I swapped my 3090 FE out for a 4090. May upgrade the 4090 to a 5090 because I have an LG C2 and an LG 32" 4K monitor I play on.
thatnitai@reddit
I genuinely don't understand why there's so much discussion all of a sudden about how top end gaming CPUs are less important because 4k sees little difference
I'm close to believing there's a conspiracy here
Able-Reference754@reddit
For me the issue is that when I go into a CPU review I want the answer to two questions:
Is my old CPU causing a bottleneck yet?
What's the best thing to upgrade to?
Something like this review answers it well.
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
I have a 5900X and see that it's still as good as new for 4k, but some day it will be in the same spot as that 2700X and I want to know when that time comes, not just have one of the major questions waved off with "doesn't matter at 4k" like all of these video reviewers seem to do.
Flynny123@reddit
I'm largely on the side of HUB here but wanted to say this is a perfectly sensible comment that helped me understand much better.
mapletune@reddit
Nah~ /u/Able-Reference754 is just lazy.
Since he brings up that TechPowerUp has 4K results but HUB doesn't, let's see how much that actually matters for his questions 1) and 2), using the Cyberpunk RT results.
1) Am I CPU-limited at 4K?
TechPowerUp's 4K RT Cyberpunk chart shows every CPU on the list at around 44 fps, with less than 2% difference, within margin of error, basically ranking every CPU as the same (which, LMAO, they aren't). So that answers OP's question, yeah? Easy enough. That's what they want.
1a) This only answers the question for those with a 4090 and a spec similar to TechPowerUp's. What about people with another high-end GPU that can do 4K who can accept somewhat less performance, let's say 30 fps instead of 44 fps? It doesn't answer the same question for them. Do reviewers have to do a 4K test for every game, for every GPU across a 2-30+ fps range, INSIDE OF A CPU REVIEW, to placate the crowd?
1b) They could have reached the same conclusion two ways without the 4K results. Either use a diagnostic tool to watch GPU utilization while gaming: if it's constantly at 95-99%, they are GPU-limited, hence any CPU upgrade would not matter (same conclusion). OR look at their in-game fps number, then check a review of their GPU, such as TechPowerUp's 4090 4K RT Cyberpunk results, and see that their fps and TechPowerUp's fps are the same, 44. Thus conclude that this level of performance is the max their GPU can do, so a CPU upgrade would not yield improvements (same conclusion). A sketch of this check follows below.
1b solves 1a's problem, in that it applies to everyone and every GPU, either by having people look at their own statistics or by checking reviews of their own GPU for any given game and quality level (1080p, 1440p, 4K, low/ultra), because GPU reviews do have that range of data, since there it's actually relevant.
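A minimal Python sketch of that 1b check, with placeholder numbers (your own fps, utilization, and the review figure are things you'd fill in yourself):

```python
# "Am I GPU-limited?" check from 1b. All numbers are placeholders.
my_gpu_utilization  = 98    # % reported by a monitoring overlay while gaming
my_fps              = 44    # what you actually see in-game
review_fps_same_gpu = 44    # your GPU's result in a GPU review, same game/settings

gpu_limited = (my_gpu_utilization >= 95
               or abs(my_fps - review_fps_same_gpu) / review_fps_same_gpu < 0.05)

if gpu_limited:
    print("GPU-limited: a faster CPU won't raise your fps here.")
else:
    print("Not GPU-limited: a CPU upgrade could help; check low-res CPU benchmarks.")
```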
2) Best thing to upgrade to...
Do TechPowerUp's 4K RT Cyberpunk results really answer this question, the best CPU to upgrade to? According to that data the 5800X3D, 7600X, 9800X3D, 13600K, 14900K and 285K are all "equally best" at ~44 fps...
Obviously not. The best gaming CPU atm is the 9800X3D, the best productivity CPUs are Intel's high-end multicore ones, and recently the top-tier AMD non-X3D CPUs are competent as well. But are those the "best thing to upgrade to"? In this respect, neither 4K results nor HUB's results can give an answer that applies to everyone's different needs and circumstances. That's why there is cost-per-fps / performance-per-dollar data on TechPowerUp, HUB, and any reputable review: it helps people at every budget level determine their best bang for buck and whether it's worth stretching that budget to go a tier higher for an outsized improvement.
Most games are more GPU-relevant though, hence the emphasis on checking and determining GPU performance first. Only if you see a game hit the same FPS across a range of different GPUs (GPU reviews compare GPUs) does that mean the game is CPU-limited on that tier of GPU, and then you'd check CPU reviews.
TL;DR:
Why don't the majority of reputable review outlets do 4K? Because you can reach the same conclusions from GPU reviews PLUS 1080p CPU reviews combined. You need to do the work yourself.
Why don't they just add it to make life easier for 4K gamers? Because that's more work, and it would only solve it for users with the highest tier of GPU at the time of the CPU review. What about 4K gamers who are using a 4080, 3080, 7900 XTX, etc.? Imagine they are also lazy and want to see results that mirror their own setup instead of cross-referencing themselves. And what about GPU reviews? People don't always have the best CPU on the market, which is what reviewers use to test GPUs. Shouldn't reviewers test GPUs with a variety of CPUs so people can see results that match their system instead of having to cross-reference and make assumptions?
Yeah, no. If you want that, do it yourself or pay someone to do it. There's no market for that kind of data on free content platforms or in free articles. Why? Again: because people can make informed and accurate decisions by cross-referencing data from GPU and CPU reviews together. Do the work and look at both. From our perspective you guys are just whining.
Comprehensive_Rise32@reddit
Not interested in reading someone's personal thesis when it starts and ends with an insult.
Skraelings@reddit
While even at 4K my 3900X is starting to show its age. As it stands, upgrading now would be a nice 11% boost, it looks like from the chart? I don't game at 4K, so it's probably even more relevant for me.
elessarjd@reddit
Agreed, this is a great point that I'm surprised many others don't understand.
ClearTacos@reddit
Shouldn't you be the one to know that? It's not a perfect metric, but does your GPU utilization often drop a lot below 100%, or are you seeing many stutters? If yes, a new CPU will help.
If you want a new GPU and want to know if your old CPU will be an issue, simply drop your rendering resolution on your current GPU until you're getting about the same performance uplift as the new GPU should give, and go back to my first paragraph.
Is this really that hard to do? Do the CPU reviewers have to test every single component configuration, at every single common resolution, at multiple settings presets, in hundreds of games, because people are just that lazy?
Thotaz@reddit
It makes no sense to seek out CPU reviews to answer question 1. You can easily tell if your current CPU is bottlenecking you by just looking at your current performance and your current CPU/GPU utilization.
In fact, I'd say doing your own testing is far better, because you get to test the whole game and determine for yourself if it's bad enough to justify an upgrade. Reviewers have to worry about reproducible results, so their testing is not always representative of the whole game (for example, there's rarely any multiplayer testing).
As for question 2, the answer is obviously the CPU that gives the greatest overall performance boost with some consideration into the price.
I don't see how 4K benchmarks come into play here. I mean would you want to just upgrade to the cheapest CPU that gets you 60 FPS or whatever at 4k regardless of the overall price/performance ratio?
Last time I upgraded, I did it because I wasn't satisfied with the performance in Battlefield, so I found some benchmarks with my CPU and the CPU I had in mind and compared the two. I could see it was XX% faster clock for clock, and because it had 2 more cores and I knew Battlefield could utilize them, I factored that into the calculation as well. And you know what? I was pretty much bang on with the performance I predicted.
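A back-of-the-envelope version of that estimate in Python; the percentages are placeholders, not the actual numbers from that upgrade:

```python
# Rough CPU-upgrade estimate like the one described above. All figures are
# hypothetical placeholders.
current_fps        = 90
clock_for_clock    = 0.20   # assumed clock-for-clock advantage of the new CPU
extra_core_scaling = 0.10   # assumed gain from extra cores the game can actually use

predicted_fps = current_fps * (1 + clock_for_clock) * (1 + extra_core_scaling)
print(f"Predicted CPU-limited fps: ~{predicted_fps:.0f}")  # ~119
```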
GarbageFeline@reddit
It's not all of a sudden. This discussion has been ongoing ever since 4K capable cards became a thing. Every single review of new CPUs is the same discussion over and over again.
And the discussion is not that CPUs are less important, it's about the testing methodology.
Flaimbot@reddit
and before that there was the 720p vs 1080p benchmark discussion. and in 20 years there will probably be a 4k vs 16k discussion.
each year a new generation of pc gamers starts building their first computer, so they keep asking the same stupid questions that have been discussed for decades already.
Strazdas1@reddit
and you keep giving them the same wrong answers?
DorianCMore@reddit
It's been a discussion for years. HUB Steve addresses this criticism every few months, in the same smug and condescending tone.
Most gamers I know play high/ultra on mid-range GPUs and would be much better off moving some money from the CPU, RAM, and motherboard budgets to the GPU budget. But instead they're pairing an RTX 3060 (or lower) with a 5800X3D and 3600 MT/s CL14 RAM, because YouTubers showed those 5800X3Ds mopping the floor with the CPUs they should have bought, at 1080p mid/high on an RTX 4090.
It has nothing to do with AMD taking the gaming crown.
Temporala@reddit
All they need to do is hold on to their more powerful CPUs for longer, and it balances out in the end; they can upgrade their GPUs when they feel like it.
It's only a problem if they upgrade their entire rig besides the GPU too often, and it's still not your problem anyway. Tell them what would be a good GPU at the time and leave it at that.
lebrowski77@reddit
In many countries the X3D chips have become insanely expensive or unavailable due to these unnecessary purchases. People who could actually use those chips to their full potential now have to pay double because Timmy wants to pair one with his 6700 XT or 3060 Ti and play at 1440p 60 fps.
Strazdas1@reddit
There is a lot of discussion because reviewers have sold you the lie that CPU does not matter by omitting games where it does matter.
Successful_Ad_8219@reddit
When AMD does well, the excuses start to become obvious. Last week there was a thread at the top of the sub with some bullshit about how the 9800X3D may not be worth the upgrade because of a 4K bottleneck. What a narrow and naive thing to say. It looks a lot like people recoiling because team blue flopped and that bruised their pride. The 9800X3D is definitely worth celebrating for delivering where Intel did not and will not. Why bash something good? That's what bitter people do.
Sleepyjo2@reddit
So first off, this discussion happens for every new CPU launch in the upper price brackets. People recommend i3/i5 CPUs over i9s despite their inherently lower 1080p performance in gaming for a reason. What the processor, or brand, in question is doesn't matter. Same reason people still recommend the 5800X3D or non-X3D 7000 parts, they simply provide enough performance.
This is a roughly 500 USD processor. Its *expected* use case is in 4k. In 4k there is an extremely minimal impact when upgrading to this CPU, currently, from nearly any other product. There are roughly 200 USD processors within a few percent on the average FPS. Its primary advantage is in the lows but for a lot of people at that resolution those lows only matter if they dip notably below 60fps.
Does that mean it's a bad chip? Certainly not; I plan to eventually buy one (or the 7800X3D if it ever drops in price). It's just extremely expensive for a 1080p setup, or not a notable upgrade for many 4K setups.
What people seem to miss, or just ignore for the sake of arguing their point, is that people don't want 4K to replace 1080p in the reviews. They want the other resolutions alongside 1080p. TechPowerUp does this and it makes them, frankly, an amazing review source.
Also, this video is slightly dumb. They're using Balanced, so we're effectively comparing 1080p and like 1220p or whatever that equates to. That's not going to show the kind of scaling numbers that the people bringing up 4K are asking about. Quality scaling, sure, a lot of people use that, but the 4K crowd looking at $500 CPUs and running 4090s aren't going to run lower scaling if they can help it. You can even see it in *their own poll*, where almost as many run native 4K as run Performance/Balanced combined (with Quality far in the lead).
People want to see 4K because they're buying a $500 CPU to play things at 4K. They don't want to extrapolate theoretical values from 1080p results that don't matter to them and won't matter to them for at least half a decade or more. If spending $200-300 of that CPU budget somewhere else gets them more value, then that's useful information. That's it.
thatnitai@reddit
I have to disagree. I follow the top-bracket coverage often, and this feels like the first time the "meh, because 4K" trend has been this strong.
Successful_Ad_8219@reddit
First off, your first paragraph is a complete non-sequitur.
Secondly, conflating the price of a CPU with a particular gaming resolution is a weak correlation at best. The data clearly shows that this CPU is beneficial at lower resolutions for those who prefer frame rate over sharpness. This is nothing new. So 1080p/1440p gaming is still very relevant and is also why it's tested. This doesn't reflect my personal use.
Thirdly, I never claimed any chip is a bad chip.
Fourthly, I'm not denying the usefulness of data.
The rest of it is also not what I'm talking about.
My main point is that there is a large bias against AMD in this sub. We got weeks of "AMD 5%" when their recent standard parts released. Now that Intel flopped, we see markedly MORE AMD discrimination instead of a proportional "Intel -15%" meme, or whatever.
So to be more specific, when I was replying to this thread, I was pointing out that the reason they see so much discussion, especially now, is the bias from the majority of this sub. That's it.
hal64@reddit
It was always the reality; it's the origin of "you only need a Core i5 for gaming." It's why people used those at 1440p+, or used early Ryzen like Zen 1 and Zen+ at higher resolutions. Cheap, same performance, and easy to upgrade when it matters (especially on AM4).
inyue@reddit
Start talking
Wooden-Agent2669@reddit
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
Simply because it's the reality.
DaBombDiggidy@reddit
What's not to understand? It's a high-end part; usually the people buying them have high-end monitors too.
kasakka1@reddit
TechPowerUp's reviews do a much better job by including generational data at 1440p and 4K, with the fastest GPU available atm.
They mostly tell me that, nope, my 13600K being 3% slower than the 9800X3D at 4K means there's no sense upgrading, even as a 4090 owner. Even 10% at 1440p doesn't make it worth the money.
I get that doing this on a whole pile of CPUs is not very rewarding, but it can still be useful for answering "Should I upgrade my older system if I'm a high-end gamer?"
greenscarfliver@reddit
Maybe we should move away from calling less-than-two-year-old hardware "older". Sure, it's "older than", but there are a lot of people running 5+ year old equipment, which really would be showing its age by now. This isn't the 90s, where your hardware was obsolete before you opened the box.
Skraelings@reddit
can confirm my 3900x is starting to struggle.
HypocritesEverywher3@reddit
Yea. I'm upgrading from 5600x and I'm pretty happy and excited for it
kasakka1@reddit
I totally agree. Doesn't mean I'm not curious how the latest tech performs in relation to what I have.
Herbdoobie710@reddit
It very much depends on the games you're playing. Certain cpu bound games will definitely see a larger increase in frames even at 4k with a 4090
SomeKidFromPA@reddit
This. Every thread on "CPU at 4K" ignores this. Every answer is essentially, "It won't help much with this game that is incredibly graphics-intensive but pretty shallow under the hood (Cyberpunk being the one used in this thread), so don't bother upgrading from your 4-year-old processor that doesn't have as many threads or cores as the newer ones."
But I play a ton of games like Cities: Skylines 2, Planet Zoo (and now Planet Coaster 2), CK3, and EU5 when it comes out. All of those games will see a pretty large improvement from the CPU, even at 4K.
Immortalius@reddit
I play a lot of those sim/tycoon games; how does the new CPU perform there?
Hayden247@reddit
Yeah, while it's true that for the vast majority of games the GPU is the bottleneck at 4K, I do play stuff like HOI4 and EU4 a lot (HOI4 is my most played game on Steam), where game speed is directly bottlenecked by CPU performance. So even at 4K with a mid-range GPU, the GPU is still asleep in these grand strategy games while the game is bottlenecked by the CPU. And in stuff like HOI4, single-threaded performance is what REALLY matters for game speed; 3D cache apparently helps too, but regardless, a faster CPU means a faster game.
What I would like to know is how much faster a 9800X3D is at running HOI4 at speed 5 vs my 7600X; that would be interesting. Regardless, I should probably wait for an 11800X3D, as that will be the AM5 king after all, but even if I stay on my 6950 XT I will see gains in HOI4 and other games like it, even if literally everything else needs a GPU upgrade. Reviewers need to bring these games up more. GN has Stellaris as a benchmark, and it's the same game engine as HOI4, so that's cool, but HUB seems to completely neglect this genre, where a CPU upgrade actually does help performance because the games are CPU-bottlenecked by their very nature. Of course they aren't good games to put on an average-FPS list, so maybe that's why, but realistically, yes, Stellaris, Victoria 3, HOI4, those kinds of benchmarks would be useful for literally everybody who plays said games, and good data.
Jonny_H@reddit
As someone who plays sim-style games a lot, I'm still very interested in new CPUs even if I am running at 4k :)
CountryBoyReddy@reddit
Some of the games showed a 30% improvement even at 4K for CPU-bound titles. The funny thing is the testing was actually incredibly insightful and useful.
This is clearly a CPU that's leaps ahead in terms of gaming performance. It has me considering it as my next upgrade, as I've been static for about 4 years.
HypocritesEverywher3@reddit
But you'll be using DLSS at 4K, so it's more than 3%, because the internal rendering resolution will be lower.
_OVERHATE_@reddit
Bro you have a 2 year old CPU, why the fuck are you looking to upgrade? what the fuck is that kind of Apple mentality there?
Strazdas1@reddit
because for some people turn simulation no longer stuttering in their strategy game is worth the upgrade.
misteryk@reddit
let them upgrade. I want more last gen stuff on used market the same day new gen comes out
tukatu0@reddit
It's the reason pc pricing is so """". This is more the norm than not
Merdiso@reddit
Mate, there are people who literally have nothing going on in their lives other than keeping their PCs up to date; this is where the mentality comes from.
Successful_Ad_8219@reddit
3% where? Averages? 1% and 0.1% lows? Game-specific? That's a generalist claim that's not entirely useful. The reason detailed testing is so important is that I want to know the specifics so I can be informed. I don't need an average chart. I want to know how it runs in the specific thing I do. For me, the 9800X3D is a no-brainer upgrade.
Hugejorma@reddit
You can easily be CPU-limited even at 1440p and 4K resolutions. Digital Foundry and Daniel Owen have made multiple videos showing this, and there's more data in Eurogamer/DF CPU reviews. Using high RT options massively impacts CPU performance. You can test how path tracing affects CPU performance even in Cyberpunk (high crowd density). My 5800X3D is a massive bottleneck in multiple scenarios. I personally don't care that much about average fps, but I would like my gaming experience to be way more fluid, with better frametimes. I'll upgrade soon to a 9800X3D.
All these CPU reviews should have high RT and path tracing tests. These will matter the most when the new RTX 50-series GPUs are released. Those will most likely have double the RT performance, so we'll see a lot more CPU-limited experiences when using RT/PT.
Raikaru@reddit
That is not 4K. You're literally showing DLSS Performance, which uses 1080p as the base resolution.
Hugejorma@reddit
It's the most realistic scenario for what people actually use with 4K monitors and TVs. 4K DLSS Performance allows high framerates and high visual settings, especially with high RT or, quite often, even path tracing. The visual difference versus a higher rendering resolution is minimal, but the hit to framerates is massive.
Even if you play at higher resolutions, the framerate is going to be the same issue. For example, if I use a higher rendering resolution, then I have to choose lower, more optimal graphical settings. I still have the same framerate target, no matter the rendering resolution. RT in multiple games can't even hold a stable 60 with the best last-gen CPU, even less so with PT.
This is a massive issue for everyone, no matter the resolution, mostly because RT affects CPU load so heavily. When the CPU is already on the verge of bottlenecking, turn on RT features and say goodbye to smooth frametimes.
elessarjd@reddit
Which makes the CPU even more relevant for longevity because we see the most gains with more powerful CPUs at 1080p, so even people running 4k will see those gains with DLSS.
ExtendedDeadline@reddit
TPU is the GOAT for reviews, and they do it in written format. I love them.
ClearTacos@reddit
TPU does a lot of good things but I don't think CPU reviews are one of them.
I pointed it out in their day 1 review, but for example, they test Hogwarts Legacy in an area where they reach 300 fps. There are, however, parts of the game world where the 7800X3D struggles to lock to 60 fps if you use RT, and it's, I believe, around 90-100 fps without RT.
Testing that area is a realistic scenario that the average person would care about, but they don't test there, so it looks like there's no difference between CPUs in that game.
ExtendedDeadline@reddit
That's fair!
Hogwarts seems like a game where the devs really could have put more time into optimization.
timorous1234567890@reddit
Nah.
Their charts are useful to reference but they have made some odd choices over the years so I don't think they are top tier.
ExtendedDeadline@reddit
Do you have some specific examples?
I am really not into YouTube format reviewers and video in general, so I really tend to weigh heavily reviewers that actually still publish good, meaty, written content, e.g. TPU, phoronix, chips and cheese. I think I missed 1-2 more good ones on this list, need more coffee.
timorous1234567890@reddit
Civ 6 FPS testing? Why? The relevant test is turn time. That game has been dropped now, though.
23H2 testing for the 9800X3D and the other Zen parts rather than using 24H2 (sure, stick to 23H2 for Arrow Lake).
Older reviews often used weaker GPUs for the CPU-limited testing, like a 3080, or a 1080 Ti when the 2080 Ti was available, rather than the fastest available GPU to minimise the influence of the GPU. This was apparent in their Zen 3 review, because they were still using a 2080 Ti for the CPU test suite rather than a 3090, and it showed some anomalies.
Some strange choices of RAM to pair with CPUs can crop up now and again, although that does usually get fixed eventually.
Their GPU tests are excellent, and I do like some of the non-gaming tests they do for CPUs, like emulation testing (which is also gaming, I guess, just a different kind). So yeah, I put them behind GN, HUB/TechSpot, ComputerBase and PCGH in terms of reviewers, but they are still a decent source and far above the likes of Tom's Hardware, for example.
As for written content, HUB's reviews are on TechSpot if you want to read them. GN publishes their reviews in article format again now. Unfortunately I don't really see us going back to the good old AnandTech deep dives; Chips and Cheese will be the closest for that.
GrumpySummoner@reddit
I think I’ve heard one of the reviewers mentioning that AMD recommended to use 23H2 for testing because of the stability issues in 24H2
timorous1234567890@reddit
That was TPU and it only seems unstable on Intel hardware.
TheRealSeeThruHead@reddit
That’s not true for everyone. I’d upgrade in a heartbeat for 10 more min fps.
madmk2@reddit
Thank you for saying it. The people who say 1080p benchmarks are worthless are legitimately insane, but a slide or two with 1440p numbers, so people with older systems have a more valuable reference point, is 100% worth including IMO.
I saw someone else refer to 1440p benchmarks as a reality check, which was a neat way to put it. It's the easiest way to prevent someone from getting caught up in the hype just because there's a new shiny blazing-fast thing, which in reality most likely won't change your experience by that much.
Doikor@reddit
Most of the time the right move is to just save the money for the next GPU. You do this for as long as you can, until you get to the point where you are actually CPU-bound at whatever resolution/settings you actually play at. At that point you start looking at the best/most sensible CPU upgrade for you.
timorous1234567890@reddit
Sure if you play the kinds of games they include in their suite.
For people who play different games, that may not always be the case, although the 13600K is a very good CPU, so it is not surprising that there is nothing really worth upgrading to right now.
r1y4h@reddit
This is not your typical CPU review. The video is simply pointing out why they test at 1080p and not at 4K. The upscaling is an exaggeration for demonstration purposes only.
id_mew@reddit
Will I see any benefit upgrading from a 12900K to the 9800X3D if I game at 4K with a 4090? I'm planning to upgrade to a 5090 eventually.
Z3r0sama2017@reddit
Depends on the games you play. Do you play a broad spectrum of games? Probably not worth it. Do you play a large number of graphically undemanding simulation games that will batter a CPU even at 4K? Very much yes.
id_mew@reddit
Thanks for the reply. I play mostly single player games and no simulation games. Maybe I'll cancel my preorder and save for the 5090.
Madvillains@reddit
In the same boat with a DDR4 12900K, but I'm at 1440p. I don't think we will see more than a 15% uplift in max and 1% low frames.
timorous1234567890@reddit
The future-proofing section really drives home why you test at CPU-limited settings in CPU reviews.
In 2022, with then-modern games, the 5800X3D had a 15% advantage over the 5800X with a 3090 Ti at 1080p native, and no advantage at the 4K DLSS Balanced setting.
In 2024, with current modern games, the 5800X3D shows a 25% advantage over the 5800X with a 4090 at the 4K DLSS Balanced setting, showing what can happen after a GPU upgrade.
It just shows how bottlenecks can shift over time: while at launch a CPU may be very GPU-bound at high-resolution settings, that does not mean it will stay that way through the life of the system. GPUs are very easy to upgrade compared to a CPU, and both are a lot easier (and more hassle-free) to upgrade than doing a whole new system build.
-DarkIdeals-@reddit
I'll give you credit. You are the first person who has explained this in a way that makes any sense. The issue is that very few people play with anything more aggressive than DLSS Quality mode due to artifacting etc., so while it may be relevant in the future, it isn't as relevant now. (HUB also has a real bad habit of lying to your face, like how this video says "80% of people we polled use DLSS" while including the 55% who said they don't play at 4K, lol. Without that it was more like 45% who use it, and only ~20% use Balanced or Performance modes.)
Vb_33@reddit
This is why TechPowerUp's 720p tests showed bigger gains for the 9800X3D than their 1080p tests. 720p should be the standard for CPU benchmarks.
lutel@reddit
Nah 480p would be better. Even less relevant in actual usage
shuzkaakra@reddit
Just run them at 1 pixel.
loozerr@reddit
DLSS Ultra Performance.
Jokes aside, some games do more aggressive culling at lower res and also at lower FOV; those do affect the GPU.
Local_Trade5404@reddit
Personally I prefer to see results for actual usage cases.
Some edge cases are nice for performance comparisons, and that's all, really.
Strazdas1@reddit
many games significantly reduce LOD at lower resolution btw. Low resolution testing is not representative of real life performance at all nowadays.
Strazdas1@reddit
4k should be standard for CPU benchmarks. Just pick correct games where CPU is the bottleneck.
Skensis@reddit
Why not 480p?
LimLovesDonuts@reddit
Not all games support 480p, whereas nearly all still support 720p.
lurkerperson11@reddit
It would be but some games don't support it. It's firmly in the camp of legacy resolution. The 5090 will probably let the 9800x3d fully breathe.
loozerr@reddit
Both have value and neither is reliable for future.
Infamous_Campaign687@reddit
By all means, but please, for the love of Deity, occasionally show whether it still matters at 4K. Low-res benchmarks are great for deciding which CPU to buy, but they aren't so great for helping you decide if now is the time to upgrade. For that, benchmarks at the resolution you game at, with 1% lows, are important. If the benchmarks show no improvement over your existing CPU, that is an important result.
But it looks as if now may be a reasonable time to upgrade an old Ryzen 5000 series even for 4K, especially if you play demanding games that require DLSS.
advester@reddit
Apparently we need a /dev/null video card driver that doesn't actually do any of the rendering.
braiam@reddit
CPU impact depends way too much on the kind of game you are playing. Tons of breakables? Physics? Simulations? Those are CPU-bound. That's why, despite upgrading my GPU, my benchmark scores in Returnal didn't improve much in the CPU-intensive part.
Strazdas1@reddit
It really drives home that reviewers will do anything in their power not to test CPUs with the right games.
There are already hundreds of games that are CPU-bottlenecked, often more popular than the games they test, but they completely ignore their existence.
FinalBase7@reddit
If you upgrade your GPU to a significantly faster one, sure, but generally bottlenecks in games shift towards the GPU; they get more GPU-bound with time than CPU-bound.
If you actually have a balanced system, like a vanilla Ryzen 5/7 with an xx60/Ti or even an xx70 card, you will hit GPU limits in new games before CPU limits. This is why I still don't see the point of splurging on CPUs: in 99% of cases it's still better to go for a CPU half the price of the best one and upgrade the GPU with the money you saved. xx80/90-class cards are where saving $200 on the CPU will not help, but below that you can get some real performance gains NOW, and in the future if you buy a better GPU.
PalePossibility2478@reddit
It depends on the games you play. I like strategy games, especially grand strategy, so basically every single game I play is CPU bound.
BrkoenEngilsh@reddit
I think CPU bottlenecks are getting worse. We aren't at the point where I would pay equal amounts for a CPU and a GPU, but my 3080 and 5900X system is bottlenecked even at 1440p. Games like Space Marine 2, Dragon's Dogma, and Spider-Man were all CPU-limited for me.
capybooya@reddit
It depends on what you notice, but with a fast GPU those frame drops in the few CPU-bound scenarios are really noticeable. Maybe owners of very fast GPUs shouldn't complain, but dropping from 120 fps 95% of the time to 60 fps 5% of the time is quite jarring, even though it's 'fine' by most standards.
laffer1@reddit
Cities: Skylines 2 was the worst. It would max out my old 3950X. I upgraded to a 14700K last year because of that. Of course, that was a mistake due to the problems, lol.
It uses 70% CPU across all cores now.
timorous1234567890@reddit
The GPU bind depends on the chosen settings. That can be scaled up and down.
There are very few settings that make much difference to CPU performance.
R1ddl3@reddit
Both are good to see. I don't know why he and so many people in the comments are talking as though it has to be one or the other. The 1440p/4k results are still what give you an idea of real world performance today.
Kryo8888@reddit
For those who wonder why Quality upscaling and not native
Kant-fan@reddit
But quality easily won the poll. Why did they test at balanced then?
timorous1234567890@reddit
What is 'real world' to 1 person is not 'real world' to another person.
Also the important part of the video was really the future proof section to show why low res testing really matters.
At 1080p native with a 3090 Ti, in Shadow of the Tomb Raider, Watch Dogs: Legion and Horizon Forbidden West, the advantage for the 5800X3D was 15% over the 5800X. At 4K DLSS Balanced there was no difference.
With a 4090, in Space Marine 2, Starfield and Jedi Survivor, the 5800X3D has a 25% advantage over the 5800X at the 4K DLSS Balanced setting and a 29% advantage at the 1080p native setting.
So if you decided to jump in on the 5800X3D at launch and then upgraded to a 4090, you are seeing 25% more performance in the CPU-demanding AAA games that are available now, and when games come out over the next few years that push the CPU a little harder, the 5800X3D will still be good for 60+ FPS, whereas the 5800X will probably start to fall below that mark and the CPU limit will be more noticeable.
NeroClaudius199907@reddit
But you're not running every game at DLSS Balanced; nowadays you'll use frame generation or just run natively. I don't believe the gap between the 5800X3D and 5800X is that big in a real-world scenario. I definitely don't use Balanced mode because it's a lot worse than DLAA/SMAA/TAA.
timorous1234567890@reddit
What people run a game at will vary from person to person and rig to rig. This is why "real-world settings" is a BS take; there are too many potential real-world configs to actually do it properly.
-DarkIdeals-@reddit
It's not BS and you know it. The reviewers just don't want to put in the time to provide a proper variety of results.
Linus Tech Tips literally did the work and put out 1080p, 1440p, and 4K results, and determined that at higher resolutions the 9800X3D is really not worth it. At this point these guys are either lazy or so keen on keeping their free samples and kickbacks that they refuse to let a new product look like anything less than a massive increase.
Laughable.
Noble00_@reddit
They ran a second poll for 1440p users: currently, with a sample of 39k people, 43% of them use upscaling, 39% play at native, and 19% are not on 1440p.
Meanwhile for 4K, currently with a sample of 36k people, 36% use upscaling, 9% play at native, and 55% are not on 4K.
Since 55% don't use 4K, making that data less relevant to the 'majority' of people watching, they went with the Balanced quality setting at around 59% of the screen resolution, so in this case ~2259x1270, which is a middle ground between the polled results for upscaling vs native at 1440p.
If you ask me, yeah, I would've liked to have seen Quality, but I assume this was their thought process given the polls.
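A quick sanity check on that internal resolution, using the ~59% per-axis scale quoted above (DLSS's exact Balanced factor may differ slightly):

```python
# Internal render resolution for 4K output at roughly a 59% per-axis scale.
output_w, output_h = 3840, 2160
scale = 0.59   # approximate per-axis scale, as stated above

internal = (round(output_w * scale), round(output_h * scale))
print(internal)   # (2266, 1274) -- close to the ~2259x1270 quoted,
                  # which sits between 1920x1080 and 2560x1440
```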
-DarkIdeals-@reddit
Except that 2259x1270 resolution is literally only halfway between 1080p and 1440p. The entire point of that video was to justify why they only do 1080p testing, because higher-res testing is "useless". But everyone with a brain knows that unless you play at 1080p with a frickin' 4090 and 9800X3D, those results are ACTUALLY useless to us.
The only thing I wanna see is whether this new chip is a worthwhile upgrade from my i7-10700K at 3440x1440, or whether I should stick with a 7800X3D or 13700K, etc.
They refuse to show us those results because they still shill for the huge corporations and can't have the new chip they got for free being only 2-3 fps faster at best vs the previous-gen product.
ApprehensiveBerry202@reddit
Cansever it's AMD unboxed.
teh_drewski@reddit
Quality just moves the bottleneck more to the GPU and while that might have made Steve's point even stronger, it wouldn't have been a good best case scenario for a CPU scaling test.
jm0112358@reddit
I selected Quality instead of "Yes, Balanced or Performance" because I could only choose one and I use Quality a bit more. However, I still use Balanced and even Performance quite a lot, and I generally like to know how much my framerate would increase if I upgraded my CPU.
jecowa@reddit
If it was me, I'd run the poll, then probably do the tests however I wanted to do them in the first place.
Toojara@reddit
I think it was deliberately chosen to exaggerate the differences because they didn't know what to expect. With how clear the differences are here, Quality should still show noticeable differences, but that's easy to say in hindsight.
Fun_Age1442@reddit
wondering the same honestly
basil_elton@reddit
Because reviewers tend to test things with a predetermined notion of the 'expected' outcomes.
dedoha@reddit
Because it wouldn't be HUB if he didn't put his personal twist on testing
NeroClaudius199907@reddit
Stop asking questions
Mutant0401@reddit
Personally even if my GPU *could* game at native 4K I probably would still just use DLSS quality to save a bit of power or to help stabilize my preferred frametime. At that sort of resolution it literally is imperceptible.
My 3070 is nowhere near a 4K card, but even DLSS performance is superior to just rendering games at 1080p on my 4K display so there is almost never a situation where I'm not going to use it.
kuddlesworth9419@reddit
I play older games at native 4K on my 1070, though I still need AA as there is still some aliasing. I haven't tried DLSS, but FSR still has a visible difference from native and I expect the same will be true of DLSS. XeSS also has a difference.
aminorityofone@reddit
realistically, how much power does this actually save over the course of a year?
PotentialAstronaut39@reddit
An average of 125W vs 250+ in my case on a 3070.
So... a lot.
aminorityofone@reddit
probably like 20 bucks a year, and that would be with heavy gaming sessions for hours every day. The point of my comment was to think realistically about power. Maybe in your part of the world electricity is double the cost.
Viend@reddit
$20 a year for someone making $800 a month living in a hot climate with expensive electricity where their AC has to cool down the heat is different to someone living in a cold place with cheap electricity.
aminorityofone@reddit
then you can't afford that gaming PC. Get an Xbox or PS5 if $20 a year is too much.
tukatu0@reddit
I agree they can't afford gaming. Suggesting they get a console instead, where they'll be forced into a subscription, is ridiculous however
PotentialAstronaut39@reddit
Don't know about your case, but in mine, running the computer with as low a power consumption as possible is the difference between the AC keeping the room at a tolerable temperature VS having intolerable heat and being unable to use it as I wish.
And that added heat also needs to be processed by the AC, worsening power consumption even more.
In my case it's 100% worth it. No wonder I went with the 7800X3D and undervolted it too. I look at the power efficiency very closely when choosing my parts.
mduell@reddit
If you game 4 hours/day 250 days/year, $20?
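As a quick sketch of the arithmetic behind that figure; the electricity price is an assumption and will vary a lot by region.

```python
# Back-of-the-envelope check of the "$20/year" estimate, using the ~125 W
# savings quoted above and an assumed $0.15/kWh electricity price.
watts_saved = 125        # W (roughly 250 W native vs ~125 W upscaled, per the thread)
hours_per_day = 4
days_per_year = 250
price_per_kwh = 0.15     # USD, assumption

kwh_saved = watts_saved / 1000 * hours_per_day * days_per_year
print(f"{kwh_saved:.0f} kWh saved, ~${kwh_saved * price_per_kwh:.2f} per year")
# -> 125 kWh saved, ~$18.75 per year
```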
BrkoenEngilsh@reddit
Same thing but for settings. Even when I have a top-tier GPU I'll spend 5 minutes tinkering with settings, and that often gives another 30+% more performance than just ultra.
Z3r0sama2017@reddit
Yeah, even though I game at 4K I still use upscaling if I can. If it hits a locked 120 I use DLAA; if it needs a helping hand, DLSS Quality, with the added benefit of fixing pixel shimmering, something I absolutely hate.
SatanicRiddle@reddit
The poll?
It's in the video itself...
So you got no answer, even though native won.
BrkoenEngilsh@reddit
Native "won" the poll, but the aggregate of upscalers had more votes than native.
schmalpal@reddit
That's the 1440p poll. The 4k poll has upscaling winning.
OGigachaod@reddit
Or misleading marketing.
DabuXian@reddit
Oh, so it’s not a 4K benchmark, it’s a 1253p benchmark. cool i guess
semitope@reddit
that's a bad poll. I use it because my GPU can't manage. lol. If I had a 4090 I'd use quality or nothing.
Klaritee@reddit
This topic has always been the most frustrating to deal with over the years. It's extremely easy to understand why games are tested this way, but there's still a large number of people posting about how dumb low-res testing is. These comments are everywhere, to the point that I think they're trolling.
Surely a brain bottleneck isn't this common?
R1ddl3@reddit
Just because people want to see 4k benchmarks, it doesn't mean they're confused about why low res testing is done. It's nice to see both. The 1080p tests give you the best idea of how a cpu performs relative to other cpus sure, but not a good idea of what performance gains you can actually expect to see if you upgrade.
Klaritee@reddit
Properly done GPU and CPU reviews are available separately where that component is the actual bottleneck to make your purchasing decisions clear.
What people are asking for is garbage data and it's a waste of the reviewers time.
R1ddl3@reddit
Yes, but in your real world use it will not be the case that both components are a bottleneck. At 4k, your cpu will not be a bottleneck. That is why it's nice to see 4k cpu benchmarks, it gives you an idea of what performance you will actually see in the real world. It's not garbage data.
UglyFrustratedppl@reddit
Once you've seen one modern CPU at native 4K you've seen them all. It is a complete waste of time to do a review on a new CPU like the 9800X3D at native 4K if you already know based on previous data how it's going to perform. This is why people making a living off this don't bother with it.
Don't believe me? Become a hardware reviewer yourself, and you'll be forced to see the light as well.
R1ddl3@reddit
Have you actually seen their 4k benchmark results though? There was a decent amount of variation and a surprisingly large overall performance gain. Those weren't results you could have extrapolated from their 1080p testing. In the past, high end CPUs that performed well in 1080p testing have shown nearly 0 performance difference compared to competitors at 4k. That's not the case here.
UglyFrustratedppl@reddit
There is a 2 fps difference between the 9600X and the 9800X3D on Techpowerup 4K tests. Look, I think we should both find something better to do. It's clear this is a waste of time.
R1ddl3@reddit
So why were Hardware Unboxed's results so different?
UglyFrustratedppl@reddit
Their subscribers voted for real-world 4K benchmarks, as in upscaled. Did you watch the video?
R1ddl3@reddit
Most of it. I was mainly interested in the plots. So does this mean you think 4k upscaled testing is worth doing then? Because I still fail to see how their 4k upscaled results could have been extrapolated from their 1080p results.
Strazdas1@reddit
It's easy to understand why it's done this way. But apparently really hard to understand why that is wrong.
teh_drewski@reddit
If only. This comment thread is itself an exercise in brain absentia.
djent_in_my_tent@reddit
It’s quite simple. I don’t game at 1080p. I game at 4k DLSS Quality. Ah, the 9800x3d has a completely negligible difference to my 5800x3d at 4k DLSS Performance?
Neat, no need to upgrade then. That is useful, actionable information that helps me make a purchase decision.
The fact that the newer chip is faster but practically benefits only academic scenarios that aren’t relevant to me doesn’t help me make a decision.
A review like this lets me know I can kick the can down the road until Zen6/Nova Lake.
jonstarks@reddit
ppl wanna see 4k numbers cause they wanna compare the difference (from what they currently have) to see if it's worth it, that's it. We don't want a guess, we don't wanna deduce from 1080p, we wanna know actual numbers.
FitCress7497@reddit
This. People watch reviews to know if it's worth upgrading or not, right? I don't need numbers to engage in online keyboard fights, I just need to know if it is good to spend money on an upgrade. That's why we ask for a more realistic scenario.
Beautiful_Chest7043@reddit
More realistic scenario is 1080p gaming not 4k gaming. Vast majority of gamers play at 1080p resolution, less than 5% game in 4k, so tell me which is the more realistic scenario ?
FitCress7497@reddit
Dude ignored the fact that the test was done with a $500 CPU and a $2000 GPU, jumped straight to the Steam survey and said 1080p is more realistic. Just how dumb is he
Successful_Ad_8219@reddit
Yes. People do watch reviews for that. The problem is defining "realistic". Your realistic isn't mine. That's why a wide range of testing is key. Give me the data. I'll determine the usefulness based on arbitrary criteria for me. Some people are fine with looking at average FPS at 4k and making a conclusion. I'm not.
LucidEats@reddit
just wait for the 9900 and 9950 X3Ds.... AMD have something mega up their sleeve
Ice-Cream-Poop@reddit
That's what I'm hoping 🤞
HypocritesEverywher3@reddit
People are complaining about why he used Balanced instead of Quality. If anything, I want to see 4K Performance and see how it compares to native 1080p and native 4K.
r1y4h@reddit
For anyone confused, this video is not your typical CPU review. The video is simply pointing out why they test at 1080p and not at 4K. 4K with upscaling is a typical practical user scenario. But I think he should have used Quality instead of Balanced.
HypocritesEverywher3@reddit
He should have used Performance so we could see the CPU impact more
XHeavygunX@reddit
Would have been nice to see him compare the 9800X3D to the 7800X3D also
PastaPandaSimon@reddit
The point is that there is zero difference in those settings.
Strazdas1@reddit
This would indicate a problem with the testing methodology. In particular, choosing GPU-bound games to test.
PastaPandaSimon@reddit
That's literally what the video is about though. Saying that testing at 4K is not a good methodology, because you're always GPU-bound, and you get nearly identical performance even using a far slower CPU. Thus my response to the person asking why the 7800X3D wasn't included: because at 4K, the performance is going to be about identical to that of the 9800X3D.
Strazdas1@reddit
Then they are flat out wrong for the reason I already mentioned: they pick GPU-bound games to test and then complain they are GPU-bound. Pick different games.
PastaPandaSimon@reddit
Which games would you test to show a difference in 4K? At that resolution, it's very rare to see a difference caused by the CPU. I was a pretty extreme example of someone who played in 4K on a 7600K. That's an old 4c/4t CPU. I upgraded that to the 7800X3D. The difference was there, but I don't recall any games where it was clearly noticeable apart from those that struggled on 4c CPUs. I can't imagine noticing a difference between any modern 8-core CPUs at 4K. Perhaps it could be measurable in some games, but it'd be pretty irrelevant.
Strazdas1@reddit
CS2 (Cities: Skylines 2), CK3, V3, Civ 6 turn time, Stellaris turn time, HOI4 in autosim mode, Total War (battle mode), Factorio, Teardown (maximum destruction precision will make your CPU cry). That's already a longer list than most reviewers' testing suites.
In those games, in my personal experience, I'm CPU-bound with a 4070. In many of them I was CPU-bound with a 1070.
XHeavygunX@reddit
Temps and power draw are also as important as fps. Even if the frame rate and 1% lows are the same, people who build SFF PCs will care about the thermal difference between the 7800X3D and the 9800X3D.
PastaPandaSimon@reddit
The power/thermals were measured by TPU and are substantially better on the 7800x3d. The 9800x3d uses ~40% more power per fps in gaming, and nearly 80% more power in compute.
XHeavygunX@reddit
I've only seen one benchmark so far comparing the 7800X3D and the 9800X3D at 4K using DLSS to mostly hit 120fps or higher, and it showed the 9800X3D pulls more watts while having the same or up to 7°C better thermals
From-UoM@reddit
Why are you using Balanced upscaling? It's rendering at 1260p.
Not much greater than 1080p....
joor@reddit
If he used native 4K, I think those 3 CPUs would show the same fps
CoronaLVR@reddit
So he is gaming the benchmark to fit his own conclusions?
UglyFrustratedppl@reddit
No, he is doing a CPU benchmark. There is a difference between a game benchmark and a CPU benchmark. A CPU benchmark tries to measure CPU performance only, therefore you don't want to be GPU-bottlenecked in such a scenario.
AACND@reddit
And no more money from AMD?
Able-Reference754@reddit
Then make that video to prove a point. Not tweak shit until you get a result and then make a misleading clickbait title.
_OVERHATE_@reddit
He did... didn't you watch the video? He tests with a 3090 that can't keep up, and shows the CPUs getting near-identical performance. It's exactly the same situation. The 4090 can't do native 4K and will just choke the CPUs, providing completely irrelevant data.
Able-Reference754@reddit
Bullshit. Proper CPU reviewers show the actual data (and also at which point the CPU does make a difference) rather than handwave away reality and make shitty upscaling videos.
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
egamruf@reddit
You need to watch the video; you're humiliating yourself.
Able-Reference754@reddit
I'm not going to watch some moron blabber on about upscaled results for 20 minutes.
_OVERHATE_@reddit
Oh so you admit you didn't watch the video? Lmao get the fuck outta here user benchmark
Able-Reference754@reddit
Do you think I'm looking for some kind of Intel-preferential results here? Nah, I want CPU tests to be comprehensive enough that they indicate which CPUs it actually makes a difference to upgrade from (see: the TPU charts showing that there is indeed a difference when upgrading from older generations, e.g. Ryzen 2000/3000 or older Intel).
Ironically, because reviewers only include latest-gen or previous-gen products, when you're trying to find out whether upgrading from, say, an 8700K to a 9800X3D is worthwhile for you as a 4K gamer, censoredbenchmark is one of the only results you can find, which is kind of the issue here.
The same issue is heavily present in GPU reviews too. Only very few reviewers have head to head comparisons of for example the RTX 2000/RX 5000 or older and RTX 4000/RX 7000 GPUs making finding information hard (outside of censoredbenchmark, which is notoriously bad)
tl;dr: there is a point where cpu starts making a difference, reviewers should do their jobs and find when there is a bottleneck instead of doing dumb upscaling videos.
timorous1234567890@reddit
how about this? 9900K vs 3700X vs 5800X3D all with a 4090.
Plenty of places do revisit articles on older hardware. It would be nice to include some of this stuff in their day-1 reviews, but under a time crunch to get a review out it can be unfeasible. I am pretty sure the only reason TPU can do that is because they don't retest everything. GN do that as well, but they also put the date the benchmark was run so you can tell it is older data used as a reference point.
Able-Reference754@reddit
This isn't even a day 1 review though. I wasn't complaining when their first review didn't have the 4k results, but I do complain when an explicitly 4k result video is all upscaled results rather than showing CPU scaling at 4k.
deleted_by_reddit@reddit
[removed]
AutoModerator@reddit
Hey Able-Reference754, your comment has been removed because it is not a trustworthy benchmark website. Consider using another website instead.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
LkMMoDC@reddit
Instead you're going to blabber on reddit about a video you didn't watch?
Able-Reference754@reddit
I checked the charts and saw none of the results are what was specified in the video title and then said it's a shitty video.
egamruf@reddit
Fair - I did implicitly provide it as one of two options. You're more than entitled to choose the other.
9897969594938281@reddit
You make the video
Able-Reference754@reddit
What kind of a person makes a 4K gaming comparison with no 4K gaming comparisons? Sub-1440p upscaled isn't 4K.
theholylancer@reddit
Because if you looked at actual 4k results, the CPU matters even less.
Upscaling actually needs a bit more CPU processing than normal, so it would actually show some differences.
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
this is a test where the 9800X3d gets 102 FPS, and the bottom most 2700X gets 77.9 FPS
I would bet that my old 9600K OCed to 5 Ghz would get 90+ fps there too
because at true 4K, the CPU is not really the bottleneck.
Strazdas1@reddit
Typical. Not a single CPU-bound game tested. Conclusion: games are not CPU-bound.
theholylancer@reddit
I mean, they even gave the best game for that, just not here...
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/8.html
I upgraded to a 7800X3D from a 9600K @ 5 GHz, despite being on 4K for the longest time, because of emulating TOTK with Ryujinx; to run it at 4K60 (before a lot of patches) you need that kind of grunt in the CPU to keep it stable, especially in the underdark.
But if you're talking about things like Factorio or Civ turn timers, yeah, there are better outfits for that than TPU, which is more general gaming aside from the emulation tests that no one else has the balls to do.
Strazdas1@reddit
Are you implying that Red Dead Redemption is the best game to test CPU?
I am.
I think all outfits, especially the major ones, should test these games for CPU reviews because they are actually CPU-bound and you don't have to worry about a GPU bottleneck there. Not only that, but they load the CPU in a lot more ways than just draw call forwarding like most games they test.
theholylancer@reddit
Umm...
It's emulating RDR on a PS3 emulator
so yes, it is a very VERY good game to test CPU on
Strazdas1@reddit
I mean sure, a PS3 emulator does utilize the CPU a lot and it is a valid test. Actually it's probably an even more useful test because it stresses different CPU workloads than traditional gaming, bringing more perspective.
TheRealSeeThruHead@reddit
We don’t care.
chasteeny@reddit
Did the video test RT as well? RT hits the CPU too, but I'm not sure how the X3D parts do in that regard
theholylancer@reddit
RT hits the GPU so hard that, while yes, it needs more CPU as well, it shifts the bottleneck even further to the GPU
chasteeny@reddit
At balanced this would definitely be offset though
Able-Reference754@reddit
Have you considered that the fact a 2700X has ~80% of the performance is exactly the kind of answer people want, for example to indicate whether upgrading is worth it and what it makes sense to upgrade to?
Oh of course CPU & GPU comparisons are only done against the last gen by these shitty youtubers. Who would ever compare performance to something more than 2 years old.
theholylancer@reddit
I mean, that is why there are tech articles.
And......... LTT of all people covered this very well in their 9800X3D video, and likewise in their building-last-gen video.
There are different flavours of techtubers, and it seems that only the people making videos for MASS consumption are actually covering what I would consider the basics.
basseng@reddit
Because it is 2024 and no one cares about native 4K when DLSS Balanced, or even Quality in some games, looks and performs better.
The entire point of this video is to look at the real use case of AAA 4K gaming. If you want the theoretical, the standard review covers that.
Sure DLAA is even better, but the performance loss for that for a negligible improvement is pointless.
4K native only matters for older titles without DLSS, which are irrelevant for someone buying a top end system today, they'll run fine.
AACND@reddit
Whaaaaat? DLSS balanced looks better? Oh god. Enough reddit today.
basseng@reddit
4090 and a 77 inch qd-oled, it does not look better on my 1440p monitor, not enough base resolution.
But as I said: in motion at 4K, when you're not pixel peeping with your face against the screen or A/B-ing screenshots, it looks almost indistinguishable 95% of the time, and it's worth it for vastly better 1% lows, especially in RT titles.
But hey I'm sure you have golden eyes and can see individual sub-pixels on your screen.
Nobli85@reddit
I'm an outlier but I play 99% of my games at 4k native with a 7900XTX. My raster is closer to a 4090 than a 4080, the card is extremely tweaked and tuned. My use case is 'real world', just less prominent than others.
SolaceInScrutiny@reddit
You play 99% of your games at native because the alternative is FSR which is garbage. If you had access to DLSS this wouldn't be the case.
Nobli85@reddit
It's actually not garbage at all on a 32 inch 4k screen, but playing at 4k on an AMD card is less than 1% of PC gamers use case I would imagine. People like to talk out their asses based on a video showing 800% zoom to trash FSR, but they haven't seen it for themselves. I play in native because I have the raster power to do it. If I wanted equal or better performance from Nvidia (and enough memory to run some of the AI models I run) then I have to spend double. And I like my money better than I like having DLSS.
virtualmnemonic@reddit
XeSS is available in nearly all titles that a 7900xtx can't max at native, and it offers a superior experience to FSR.
basseng@reddit
I'd bet £20 though that if FSR 4.0 (or whatever they call the machine-learning version they release) is just as good as DLSS is now, you'd toggle it on for the free 20-30% performance and better anti-aliasing.
Even if you're capping your FPS (I cap to 117 on my TV for VRR), it's still worth it IMO to save power.
Nobli85@reddit
I cap at 144. If it actually is as good as native, yeah I'll use it to cut back on the card sucking back power.
FitCress7497@reddit
it's not even quality upscale
FinalBase7@reddit
Yes, upscaling is not a problem and it perfectly fits the "realistic use case" scenario this video is trying to create, but people go for Quality, not Balanced, especially at these ridiculously high framerates; you could even go native in more than half of these games.
JensensJohnson@reddit
The kind of person that actually games at 4K. If you did too, you'd know DLSS works so well it renders native 4K pointless
Able-Reference754@reddit
I do game on 4K, just not with upscaling (AMD moment).
JensensJohnson@reddit
Understandable, have a nice day
Healthy_BrAd6254@reddit
1440p upscaled to 4k is visually basically identical to 4k. See HUB's ~30 game comparison between 4k native and 4k DLSS Q
Running 4k natively does not make sense anymore (in most games). DLSS is too good.
If you have AMD, well yeah FSR looks worse than native.
Able-Reference754@reddit
This isn't an upscaling quality comparison video; it's supposed to be a 4K gaming benchmark video, so they should benchmark at 4K. Nobody gives a fuck about whether it supposedly looks the same as 4K (it doesn't, except maybe compared to TAA, which ruins the image by itself).
chasteeny@reddit
It's not a 4K benchmark video either... it's a real-world-use-case 4K gaming video. I'd wager even most 4090 owners still upscale anyways.
kyledawg92@reddit
I don't know why both the community and Steve are acting like this is something to argue about. Everyone understands that you want to eliminate GPU bottleneck in order to test CPUs. However people also do want a "performance guide" for their games to see if it's worth upgrading in a higher resolution scenario. I don't see how that's irrelevant or something to dismiss.
If anything, this video to me just proves that should be something to continue tracking, especially as GPUs become more powerful and the CPUs can actually separate themselves on the charts.
deadheadkid92@reddit
There's a bias a lot of people have about science that if the results aren't different or interesting then the science itself is worthless, which couldn't be farther from the truth.
A CPU review at 4K that shows all of the latest chips performing the same in a GPU-bound scenario might seem worthless to people who have watched dozens of these videos in the past, but for someone trying to build a new computer with a 4K monitor it would be extremely useful in understanding how this stuff works.
I think HUB Steve has performed so many of these tests in his work that he actually can't understand that perspective of a new buyer anymore.
timorous1234567890@reddit
How is it useful?
If I am building a brand new rig then I have multiple criteria. I want it to play the games I am currently playing well. I want it to last for several years. I want it to fit within my budget.
If my target res is 4K and I see that cheap CPU A performs the same as expensive CPU B, then I will probably buy cheap CPU A and put more towards a GPU. That sounds great initially, because it means I get more image quality for my budget, but I have no clue how it will last: for all I know CPU A is just on the cusp of being CPU-limited, so games with higher CPU requirements due in the next year or two might push the rig below my target frame rate.
OTOH, by seeing the 1080p or other lower-res data I can see the spread of performance and decide on a setup that better balances my buying criteria. Sure, I might sacrifice a small amount of image quality right now, but if that means I can stay above my performance target for 2-3 more years because I went with a CPU that had enough headroom, then to me that is a better balance.
LimLovesDonuts@reddit
Because some people are just plain misinformed. Just search for "9800x3d" on Reddit and look at the number of posts justifying why you shouldn't get the 9800X3D if you play at 4K, which is misleading. There are still people who don't get why tests are done at 1080p, and hopefully a video like this helps educate and inform them on the rationale behind the methodology.
Strazdas1@reddit
Not to mention people who understand why tests are done at 1080p and just plain disagree that it is a good way to do the test.
MdxBhmt@reddit
That's the point, so many don't.
DependentOnIt@reddit
Just more content. Really not much to it. We all know the 9800x3d is gonna sell gangbusters.
So amdunboxed releases a meta video about it instead
laffer1@reddit
I tried to point this out to HUB on Twitter a while back and they put me on blast in front of everyone. Steve is unable to think from other perspectives. He can't see why people might want to see that and not just do scaling math
Flaimbot@reddit
for cpus, look at the 1080p fps numbers.
like, real hard.
memorize it like it's gonna be part of your final exams.
now imagine on top of the table there's written "4k".
because they will match close enough to not matter.
everything else is gpu scaling and not part of a cpu review.
elessarjd@reddit
Exactly! This video is (unintentionally) the most helpful and informative 9800X3D video that has come out since release imo.
caedin8@reddit
They are constantly trying to argue about something because it drives money into their pockets through engagement
PotentialAstronaut39@reddit
I don't get the 1080p benchmark "ICK" when benchmarking CPUs.
You check the Steam survey, 1080p is still at 56.98% of users. 1200p (1920x1200) and below (cumulative down to the lowest res) is also still 65.58%.
1440p is only 20%, 4K is below 4%.
jecowa@reddit
That's how I feel about it. If 60% of Steam users were on 1440p, that's probably what most people would be testing with.
But also, 12-30% of those Steam users are on laptops, and most of them are probably using the built-in display, and so their resolutions don't really matter for desktop CPU benchmarking. Perhaps only like 34% of Steam users actually have an external 1080p display, and there's up to 30% who are in the market for a new display.
I still think 1080p is the majority and worth testing at, but that 60% figure is a bit inflated with the laptop users.
Early_Highway_3779@reddit
8700K at 5 GHz user here. I guess I will see a significant improvement upgrading to a 9800X3D while playing at 4K, right? Right now I can play everything at 4K with the 8700K paired with a 3070 Ti (obviously DLSS + tweaking the most demanding settings), and I can achieve a minimum of 60fps. Wondering how much improvement I can get changing only the CPU... (in a few months, the GPU too).
KeyVarious5666@reddit
Right now I have a 3070; I was wondering if I should get a 4080 Super or wait for the 5070/5080?
Early_Highway_3779@reddit
I'm in the same boat. 3070 Ti here, waiting for the 5000 series release... It all depends on the price. If they release the 80s with a huge overprice I won't buy one and I'll go for a 4080 Super.
SatanicRiddle@reddit
He seems to not get it.
The point of real-world performance testing is to tell people how to distribute their budget.
Strazdas1@reddit
This switch would not be silly at all if you play sim-heavy games where even with a 4060 you would be CPU-bottlenecked.
basil_elton@reddit
Going from 1080p native to 1252p (DLSS balanced) makes the games tested GPU-limited with the 3090 Ti?
Something is off here.
Strazdas1@reddit
Yes. Selection of the games to test is off.
Jonny_H@reddit
Remember that the "p" number only describes one dimension; scale both dimensions and the number of pixels increases with the square of the ratio.
So 1080p->1252p actually has a ~35% increase in pixels generated. Plus whatever time DLSS takes to run, which is small but non-zero.
If pixel shader limited, this is a big enough gap to start seeing GPU limits if you're already pretty close.
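A quick sketch of that pixel-count math, for anyone who wants to check it; the figures follow directly from squaring the ratio of the "p" numbers.

```python
# Pixel count scales with the square of the vertical-resolution ratio,
# since both dimensions scale together at a fixed aspect ratio.
def pixel_ratio(p_from, p_to):
    return (p_to / p_from) ** 2

print(f"1080p -> 1252p: {pixel_ratio(1080, 1252):.2f}x pixels")  # ~1.34x, i.e. ~34% more
print(f"1080p -> 1440p: {pixel_ratio(1080, 1440):.2f}x pixels")  # ~1.78x, the 1.75x quoted below
```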
basil_elton@reddit
Literally doesn't happen in two of the games tested compared to some of their old reviews.
5600X + 3090 in this video vs. their old "NVIDIA has a driver overhead problem" video:
Both of these games are well-threaded, so with a 5800X instead of a 5600X, the FPS loss will be recovered. Not to mention that with a 3090 Ti which has a higher boost frequency and memory BW, there should be virtually no difference.
So if a 1.75x increase in pixels rendered doesn't induce a GPU limited scenario, a 1.33x increase in rendered pixels won't cause a GPU limited scenario either.
Jonny_H@reddit
Perhaps the workload they're testing isn't pixel-shader limited? Do those games have canned benchmark runs? A slightly different test location/pass could easily show large differences. I know HUB sometimes "dislike" those canned passes and might even avoid them (as they don't think they're a good representation of in-game performance).
And driver/app version differences means that a result from over 3 years ago likely isn't comparable to new runs today.
basil_elton@reddit
These have built-in benchmarks, and are old enough to have limited scope for performance improvements with driver updates.
The point is that DLSS balanced is in-between 1440p native and 1080p native, so it would only slightly increase GPU load, and should definitely not cause CPU differences, if there was any at 1080p, to vanish altogether.
Jonny_H@reddit
Again that assumes everything scales perfectly on pixel count, and DLSS itself has zero cost. Or there isn't some other limit that doesn't scale well with SM count between the 3090 and 3090ti.
And I wouldn't discount performance changes over time - 3 years is a long time in driver development, lots of things will likely have changed even if it wasn't targeted at that particular game, it all ends up in the same software layers and shader compiler as "newer" games after all.
My point is scaling not being what you expect between two old benchmarks isn't really the "gotcha" you seem to think, it happens all the time.
You shouldn't assume anything in performance that isn't directly tested - there's so many moving parts it's often surprising.
basil_elton@reddit
This isn't about the reasons behind the difference but the magnitude of the differences between 4K balanced upscaling and 1440p compared to 1080p.
ferom@reddit
Not really. It's a 33% increase in rendering resolution and some overhead from DLSS.
Savage4Pro@reddit
The 5090 (or 6090 or 7090) will show eventually for those who want to see a difference in 4k results.
JDSP_@reddit
For the current selection of games; for games in 2 years' time it for sure won't
Strazdas1@reddit
Then select better games. why do they insist on testing CPUs on the most gpu-bound games?
Noreng@reddit
It's been 7 years since we started getting 8-core CPUs for regular platforms, and in that time we have gone from most games seeing benefits from more than 4 cores, to most games seeing small benefits from more than 6 cores.
Games work on such short timescales that they rarely benefit from more cores. The overhead introduced by splitting tasks across more cores can often make the game run slower overall, because the main thread spends more time synchronizing the game state than actually doing stuff.
IANVS@reddit
Indeed. I remember the hype around 8-core CPUs years ago, with people raving about how games would use all those cores and threads, how 6 cores weren't enough anymore and 4-core CPUs were dead, and... meanwhile, years later, we are where we are and 6-core CPUs are still more than adequate for gaming. Hell, even 4-core ones would still be usable if Intel and AMD hadn't abandoned them for monetary reasons, not lack of power.
Consoles are still the lowest common denominator and as long as multiplatform games are a thing and consoles keep having mediocre tier hardware in them, the CPU situation you mentioned is not going to change.
crshbndct@reddit
I wouldn’t call them that mediocre, they are still the best value for gaming. A GPU with comparable speed to a console costs more than a whole console, still.
tukatu0@reddit
An RX 7600 doesn't cost what a PS5 does. Neither does a 3070 cost $800. More importantly, you are going to be comparing it to a 5060, not 5-7 year old hardware, when most people will buy PS5 Pros.
Please don't repeat those lies that consoles are cheaper/better value than PCs. Today they are not. Well, except the Series S, but that thing is worthless to me so I'll blatantly ignore it. The PS5 sub loves to say that, but it is not true once you add the online subscription
crshbndct@reddit
I just looked at the best priced PC parts retailer around here; and an RX7600 is the same price as a PS5.
Not just that, but the PS5 is everything. The GPU can’t play games without a PC to plug it into.
tukatu0@reddit
Good lord wtf. It's like $250 in america. A completely new zen 5 build can be as low as $500.
Trying to match a PS5 would, however, mean going used, with a 3700X or similar. It's kind of possible for $350 with only the CPU and motherboard being used; combined with a $200 6600 XT that puts it at $550.
But yeah... It's such a hassle it's completely reasonable to just click and buy and be done with the console process. I'm sure $300 used ps5s exist out there somewhere
So sorry. My own bias with the ps5 pro and ps5 subreddit influenced me more than i thought.
crshbndct@reddit
I suspect it is that way in most of the world too. Consoles have a lower cost of entry and nowadays lower TCO just because of the extortionate prices on PC hardware
laffer1@reddit
There are some games that will use all cores, like Cities: Skylines 2. At the end of the day, Amdahl's law applies though
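For anyone unfamiliar, a minimal illustration of why Amdahl's law bites here; the 60% parallel fraction is a made-up figure for a hypothetical game loop, not a measurement.

```python
# Amdahl's law: overall speedup is limited by the serial (main-thread) share.
# The 0.6 parallel fraction below is an assumption for illustration only.
def amdahl_speedup(parallel_fraction, cores):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for cores in (4, 6, 8, 16):
    print(f"{cores} cores: {amdahl_speedup(0.6, cores):.2f}x")
# 4: 1.82x, 6: 2.00x, 8: 2.11x, 16: 2.29x -> going from 6 to 16 cores only
# buys ~14% more once the serial main-thread work dominates.
```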
DryMedicine1636@reddit
I upgraded from 10900k to 7950x3d, and there's almost zero difference at 4K DLSS performance/balanced Cyberpunk PT. Some games do see a sizable improvement, though.
And if I'm going over the frame limit, then I'd rather bump DLSS to quality or even DLAA, especially for non-competitive games.
Ryoohki_360@reddit
Cyberpunk PT is GPU-bound at 1080p. Pretty sure it will take at least 2 generations before we see CPU bottlenecks in PT for any game that has it (Alan Wake 2, Wukong or the Portal remasters). There won't be that many PT games because consoles can't do it; mostly games financed by Nvidia for the foreseeable future
Hugejorma@reddit
You'd think it's GPU-bound at 1080p with PT, but I'm getting fully CPU-bound scenarios in the low 30s fps even during 4K Ultra Performance PT CPU tests. I even linked 4K DLSS Performance video clips from multiple sources that show limitations without PT turned on.
You can super easily hit CPU-bottlenecked situations when you use PT + high crowd density. I even ran multiple test sessions to validate these findings, because at first I wouldn't believe how massive the CPU hit from PT was. It's hard not to get CPU-bottlenecked when using a CPU like the 5800X3D. With a better CPU, PT would run way smoother (frametimes) and offer an overall better experience.
PS. I have 15+ years of CPU & GPU testing and monitoring in AAA games, especially CPU-side testing per core/thread.
DataLore19@reddit
What games did you see a big improvement in? Just interested because I have a 10900k and am thinking about upgrading but I haven't found any games with really high end graphics (CP2077, Alan Wake 2, Hellblade 2 etc.) that are very CPU limited especially when I turn on frame gen. I have a 240hz monitor but don't play e-sports titles.
DryMedicine1636@reddit
I only have a 120Hz 4K and a 144Hz 21:9 1440p display. The good ol' GPU usage metric is not perfect, but it's a pretty low-effort indicator.
I think it was Dying Light 2 that got me to upgrade.
DataLore19@reddit
Thx!
F9-0021@reddit
Future games will use the CPU differently than current games as well. If you go back 5 years, Shadow of the Tomb Raider doesn't use the CPU nearly the same way Cyberpunk 2.0 does. You could probably get away with a fast chip like an 8600K in Shadow, but that won't work in Cyberpunk. Cyberpunk 2.0 at max settings is almost too much for a 6-core chip now, and there's not much margin on an 8-core. In another 5 years an 8-core chip might be in the same boat as a 6-core chip is now.
jacket13@reddit
Yes, but the cache on an X3D CPU is different. Since these CPUs have more L3 cache, they can keep bigger chunks of data close to the cores each clock cycle; in layman's terms, they move more water because the CPU has a bigger bucket.
Games can leverage this trait without changes; we saw it once before in the Core 2 era, when CPUs suddenly got more cache and games got a huge FPS boost because of it.
That is different from having more cores available to the game, which has to be hard-coded into the game to make use of the extra helpers.
RogueIsCrap@reddit
Most likely even the 5090 won’t be able to handle games like Cyberpunk and Alan Wake 2 at native 4K 60 with max settings. 60 fps is pretty low too for pc gaming nowadays. Most 5090 gamers would want to aim for 120 fps and use AI upscaling instead, which is why 1080p/1440p CPU performance is more important.
tukatu0@reddit
I was wondering what you meant. I thought Alan Wake 2 ran at 4K 40fps; a 50% uplift should be more than enough for 60fps, though not stable. Turns out it runs at 35fps.
That L2 cache is helping path tracing massively at lower-than-1080p resolutions. Well, I guess that is something ¯\_(ツ)_/¯
_Cava_@reddit
You think the jump from 4090 to 5090 will be similar to the jump from 1080p to 4k?
Ryoohki_360@reddit
The 5090 has new tech, new GDDR (GDDR6X is already constraining the 4090), new RT cores, a higher wattage ceiling, etc. The difference will be pretty considerable
Azzcrakbandit@reddit
It's on the same node though.
Noreng@reddit
The 4090 is performing extremely poorly relative to its SM count, in a similar fashion to Kepler in many ways.
redsunstar@reddit
Isn't that also the case for the 7900XTX (well, not SMs, but stream processors)?
It's been my impression that performance isn't scaling that well with core count at the top end anymore, regardless of AMD or Nvidia. There's probably some inefficiency that needs to be solved in the front end.
tukatu0@reddit
I recall seeing rumours that the problem had been found and fixed. That is why we are getting an arch change this gen. The 4080 was already a bit of a xx70; restructured, I would hope it beats the 4090.
Noreng@reddit
RDNA3's problem isn't that the CUs aren't fed properly; there just aren't enough of them.
Dangerman1337@reddit
So was Kepler to Maxwell.
Azzcrakbandit@reddit
Correction. Kepler was 28nm while maxwell was produced on 28nm, 20nm, and 16nm.
OftenSarcastic@reddit
Clarification. The Maxwell chips used in the GTX 900 series graphics cards were 28nm.
The 20nm and 16nm Maxwell products were Tegra/Switch SOCs.
A quick trip to TPU's old review suggests the GTX 980 was 7.5% faster than the GTX 780 Ti, while consuming 31% less power. A 56% increase in performance per watt while on the same 28 nm process.
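That perf/W figure checks out; here's the arithmetic as a quick sketch:

```python
# 7.5% faster while drawing 31% less power (i.e. 69% of the power).
perf_ratio = 1.075
power_ratio = 1 - 0.31
print(f"{perf_ratio / power_ratio:.3f}x perf per watt")  # ~1.558x, i.e. ~56% better
```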
tukatu0@reddit
Meanwhile the 4060 gets a mere 30% power reduction over the 3060, and a 20% uplift below 1080p thanks to its L2 cache.
Well, the point is: even if there is a big increase, there's no reason to believe it will actually be passed on.
techraito@reddit
Not natively. There will be upscaling tricks like DLSS to get more acceptable framerates.
TimeGoddess_@reddit
Yes, inversely. If the 5090 is about the same improvement as the 4090 was (75-100% over the 3090), then you'll see the same performance at 4K with a 5090 as a 3090 gets at 1080p.
I don't expect it to be that much faster though. It's a mixed bag right now: the most accurate leaker for NVIDIA GPUs says the 5080 is the 4090 plus 10%, and the 5090 is literally double the 5080 in size, so I'm not sure what to expect
dudemanguy301@reddit
Something tells me 4x the resolution is going to need more than just 2x the performance.
Noreng@reddit
Just generate more AI pixels. With DLSS frame gen and DLSS Ultra Performance, you're looking at 17/18 pixels being generated by AI.
We can obviously go further
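The 17/18 figure follows from the preset's scale factor plus frame generation; a quick sketch, assuming Ultra Performance renders at 1/3 scale per axis and frame gen generates every other frame.

```python
# Ultra Performance renders 1/9 of the output pixels (1/3 scale per axis);
# frame generation means only every other displayed frame is rendered at all.
rendered_fraction = (1 / 3) ** 2 * (1 / 2)                 # = 1/18 of displayed pixels
print(f"AI-generated share: {1 - rendered_fraction:.4f}")  # ~0.9444, i.e. 17/18
```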
bctoy@reddit
About 2.5x using this 3090 vs 6900XT comparison at 4k and 8k.
https://www.youtube.com/watch?v=UBb8UF9DKfQ
TimeGoddess_@reddit
That's not how it works; only in PT does performance scale close to linearly with resolution. You usually need about 2x the performance to get the same framerate at 4K that a half-as-fast card gets at 1080p.
https://www.youtube.com/watch?v=cmjnT0wmCBo
This video is an oldie but a goldie, and you can just look at any GPU review (Hardware Unboxed, TPU, etc.) and compare the half-as-fast card's 1080p average against the faster card's 4K average.
dudemanguy301@reddit
Raytracing also tips the scales in the direction of linear scaling and is much more common, and a use case that is easier to argue when speculating on the 50 series.
Savage4Pro@reddit
Nope, definitely not.
Jeep-Eep@reddit
The 9800X3D will serve a gaming rig very well until the final chips on AM5.
empty_branch437@reddit
Why doesn't anyone test 720p?
OGigachaod@reddit
Show me a gamer with a 4090 that games at 720p.
empty_branch437@reddit
Nobody games at 720 with a 4090, the conversation is about CPU testing.
dedoha@reddit
Unpopular opinion here: because it's a waste of time for the most part, and I've yet to see anyone draw any convincing conclusions from them. It's supposed to predict the future, but 2020 games achieving 200fps at 720p has no bearing on 2026 games run at "real world" settings
conquer69@reddit
Look at the gains of the 9800x3d over the 7800x3d at 720p and 1080p on the TPU site.
6% faster at 720p and 3% faster at 1080p. The generational gains were cut in half when increasing the resolution. That's a gpu bottleneck.
The reason why "real world" testing isn't done is that it's different for every person. Maybe your real world is playing at 4K native while mine is 4K with DLSS Quality at medium settings but RT on. We can't expect reviewers to cater to each of our individual needs.
Seriously, read this thread. There is someone complaining he used Balanced instead of Quality and another commenter complaining he didn't use Performance.
Just test the CPUs at 720p to get their data, test the GPUs at each of the 3 resolutions, and cross-reference with whatever your budget or build is. Then knock off a couple of percent to account for DLSS or FSR because they aren't free. That's what everyone should be doing in their own heads instead of demanding the reviewers do it for them.
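That cross-referencing amounts to taking the lower of the two limits; a minimal sketch, with made-up numbers for illustration:

```python
# Estimated in-game FPS is roughly the lower of the CPU-limited figure
# (from a low-res CPU review) and the GPU-limited figure (from a GPU review
# at your resolution), trimmed a little for upscaler overhead.
def estimate_fps(cpu_limited_fps, gpu_limited_fps, upscaler_overhead=0.03):
    return min(cpu_limited_fps, gpu_limited_fps) * (1 - upscaler_overhead)

print(estimate_fps(cpu_limited_fps=180, gpu_limited_fps=110))  # ~106.7 -> GPU-bound
print(estimate_fps(cpu_limited_fps=90, gpu_limited_fps=110))   # ~87.3  -> CPU-bound
```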
BUDA20@reddit
Hardware Canucks
did, and even if the likes are hugely positive, there are absurd angry comments too, and that's why...
https://youtu.be/8H0xeRE21_w?si=BqA-z7YCfgj6nRZ3&t=302
TheRealSeeThruHead@reddit
They better test the 5090 with every single cpu that’s come out in the last 4 years then.
OGigachaod@reddit
They've also got a whole bunch of retesting to do after Intel fixes its new CPUs.
mapletune@reddit
in my opinion, this HUB video should have cut 70% out and just kept the longevity part. i think that illustrates the difference better.
ok maybe not cut it out entirely, but make it shorter to the point. only reasonable people watch bigger portions of boring data presentation, and unreasonable people will just read title and comment. if it was a 10 minute video, maybe some of those would actually try to understand.... nah~ i'm being too optimistic
_OVERHATE_@reddit
Wait for Nvidia to release the 5090, then the 5090 Ti, then the 5099, each with a $400 premium over the previous one
HandheldAddict@reddit
The point of a CPU benchmark is the assumption that consumers will upgrade their GPUs more frequently than their CPUs.
A CPU upgrade generally means a motherboard and memory upgrade as well.
A GPU upgrade, meanwhile, is usually a drop-in replacement.
Nameless_Koala@reddit
If only this CPU was easy to find
Z3r0sama2017@reddit
To be fair, the 9800X3D is a great CPU if you're upgrading from a 5800X3D or older Intel CPUs.
It's little wonder it's getting dogpiled by everyone and their mother who upgrades every 2-3 cycles.
ImMattHatter@reddit
Do you think it will be a reasonable upgrade going from an i9-9900K to the 9800X3D? Mainly playing games at 1440p, occasionally 4K.
2ndFloosh@reddit
It'll probably be 4-5 years before I can afford 4090-level performance, so I might be sticking with my 5800X3D and 7900XTX for 4K/60Hz gaming until AM6 comes out.
ApprehensiveBerry202@reddit
More like AMD Unboxed.
Toffly@reddit
This CPU got me drooling. Will be a solid upgrade from my 9700k!
AlphisH@reddit
I really can't wait for 9950x3D....
R1250GS@reddit
The 9800X3D is fantastic at any resolution, depending on the video card and monitor you have. 1080p? Nothing is going to touch it. But here is an example: I have a 7950X with a 4090 and a 4K monitor, and this CPU would make absolutely no sense for me. My other system is a 7600X with a 4080 Super and a 4K monitor, and it's just fine playing games at max settings. However, if I had a 3080 or less, where I had to game at 1080p or 2560x1440, then the 9800X3D would be a buy. Most reviews show this chip at 1080p, so it's nice to see some 4K comparisons, where the GPU takes up the slack.
diak@reddit
Why is he testing 4K with Balanced upscaling in games where he's getting over 150 fps?
basseng@reddit
Because he's intending this to be a realistic use case for the majority of gamers. This isn't a benchmark review (they did that) this is to show the actual difference most people will experience.
And balanced DLSS looks better than native (TAA) in all new titles, only games on older DLSS versions look worse (and you can dll swap those).
FinalBase7@reddit
He ran a poll and most people said they run DLSS Quality at 4K, not Balanced, yet he still benchmarked Balanced. Also, you're way off: not even Quality mode looks better than native in most games, only in select ones, and Balanced is a significant downgrade from Quality. It all depends on the game of course, but using Balanced when you're getting well over 120 FPS is not very realistic.
basseng@reddit
First, you're just flat wrong about native looking better. All these titles use TAA, which looks so damn ugly that the AA provided by DLSS is a huge improvement. Yes, you could use DLAA, which would look better, but that isn't native anyway. Sure, you can see the difference if you're pixel peeping or A/B-ing screenshots; however, in action, in motion, you cannot tell 99% of the time (the only time you can is when you get rare DLSS artefacts, which are marginally worse in Balanced).
Second, I bet most people who answered that poll don't have a 4090 (which is the only relevant card when you're talking about pairing it with a 9800X3D) and don't even game at 4K - that's the problem with online polls, anyone can answer them. I'd also bet this video was already in the can (or at least all the benchmarking done) before the results of that poll were available. Most YT creators film at least a couple of days ahead (due to editing and such).
Finally, there is a damn good reason to use balanced, because while the average may be over 120 in their benchmarks that is NOT true in reality. Their benchmarks are just a slice of a game, a repeatable slice.
The reality is that most if not all of the GPU-demanding games they test that I've played may average 120 or more, but the 1% lows can dip into the 70s - specifically Warhammer, TLoU and Remnant 2.
Balanced significantly mitigates that in my experience (7800X3D + 4090) resulting in smoother games (capped to 117 for the VRR).
basseng@reddit
The poll was 2 days ago, the benchmarks for this video would have been long done, they'd have already been editing it by that point if not already finished.
Most people answering that poll use Quality because they don't have 4090s.
There is not a single game that justifies using Quality outside of path tracing (all 2 titles), because dropping from Quality to Balanced is not a noticeable image quality loss.
Balanced looks as good as (if not better than) native in most titles - that is why they used it.
livemau5_01@reddit
No one is playing balanced upscaling on a 4090 lmao. Even quality can be blurry af depending on the game.. unrealiiiiistic
JensensJohnson@reddit
I play with upscaling on virtually all the time and I haven't come across a single game where the Quality or even Balanced preset would look anywhere near "blurry af"
basseng@reddit
Spoken like someone who doesn't game at 4K on a 4090.
I do and I do.
CoronaLVR@reddit
Is it really realistic to play at DLSS Balanced when you get >150 fps instead of switching to DLSS Quality?
basseng@reddit
Depends on the game really and if balanced to quality is an appreciable increase in fidelity.
In my opinion it is not, apart from pixel peeping I cannot tell the difference between balanced and quality for new AAA games even when I sit right in front (2-3 feet) of my oled TV. Even to the point it's uncomfortably close (barely in my field of view anymore like sitting 6-12 inches from a 27 inch), which I only do when a/b'ing settings.
Performance is a noticeable drop in clarity though unless sitting well back.
EJ19876@reddit
A weird choice of CPUs given anyone who cares about gaming is going to look at the 7800X3D and 14900k, not the 7700X or 285k. The 5700X3D also makes more sense to use in this comparison if they're wanting a lower priced alternative.
Flaimbot@reddit
It's to show the performance delta. He might have explained the very same thing with a 2600K and an 8700K, and you'd still moan that nobody uses those CPUs, because you're not understanding that he's explaining the concept that CPU performance doesn't scale with resolution, while GPU bottlenecks do.
it boils down to "if a cpu hits 120fps at 1080p, it hits 120fps at 4k - if you have the gpu to support it"
benjiro3000@reddit
The problem is, does a 7800X3D or 5700X3D even make sense for people who are NOT running 4090s or whatever other 1000-euro card?
Even a 3090 Ti hits GPU bottlenecks the moment you start pushing higher resolutions.
Somebody who has the cash to get a 4090 is probably going to use the latest and greatest CPU, like the 9800X3D.
But all the other "plebs" that game at 1440p, well, do you matter? For me 1440p is the perfect middle ground: good visuals, no need for AA, and a good balance for the amount of money you spend on GPU/CPUs.
I have run a 5800X, 5700X3D, 7700 and 7945HX, all on the same 6800, and frankly, unless I run a specific benchmark, I can't spot any FPS issues in games at 1440p. The bottleneck has always been the GPU, despite it being a 6800.
There are games that can benefit from that 3D cache at any resolution, especially Factorio and the like, where as the game progresses it grinds to a halt because of the increase in elements it needs to calculate.
I don't know, most reviews feel so boring these days, because if you stop looking at the numbers and actually use whatever CPU you have, unless you make a good jump, like multiple generations, you often don't even see that 10% or 20% uplift. I have literally downgraded between system upgrades and gone "I can barely spot a difference, and after a few minutes, that goes away".
A lot of hardware simply bottlenecks on the next issue.
Tuxhorn@reddit
What other CPU in the budget of a 5700X3D would you get for a gaming focused build?
AnimalShithouse@reddit
The 5700x3d makes a lot of sense because it's just very cheap and very good. Most people would be fine to stay on AM4 into the late 2020s, only upgrading the GPU as needed --> if their workloads are primarily gaming.
I think I'm one of the few out there that will still recommend a new AM4 build over AM5 when on a budget. You can still shave hundreds of dollars off the total build and get exceptional performance from an X3D chip + DDR4. And "good enough" mobos can be had for <$100.
TheRealSeeThruHead@reddit
For a video explaining why they don't do 4K benchmarks, the actually interesting part was... the 4K benchmark numbers.
I actually DO care if paying $200 more can increase my 1% lows by 10fps in games I'm actually going to play, with settings I'm actually going to use.
No it’s not a way to compare cpus. But again, I could not care less about how cpus compare against each other outside of the context of 4k ultra gaming.
All I care about are those 1% low numbers.
PixelatumGenitallus@reddit
Then after you watch the CPU benchmark, look at the 4K benchmark for the GPU you're using at the resolution and settings you use. Compare the 1% lows between the two. If the CPU's 1% low is higher than the GPU's, then the numbers you'll see with the combo are the GPU benchmark results.
Just make sure the CPU used for the GPU benchmark is not a bottleneck at 4K (it doesn't have to be the same CPU you're about to buy, it only has to be faster at 1080p than the GPU is at 4K).
KayakShrimp@reddit
I'm in the same boat. Beyond a certain point, I don't care about average framerate at all. I only care about 1% / 0.1% lows at realistic settings. That's where the perception of smooth gameplay comes from.
I don't care about average frames at 1080p low. I'll never play a game at 1080p low. I'd rather see benchmarks for what I'll actually use the CPU for. I don't want to guess how some spherical cow-type numbers will scale to the real world.
MajorTankz@reddit
Low res benchmarks are still useful for you guys.
For example, if a low res benchmark shows a 1% low of 100 FPS, then that is the maximum conceivable performance you can expect with that CPU.
Put in another way, if your aim is a minimum of 144 FPS, then a benchmark that shows 100 FPS at a low resolution shows you plainly that this CPU is NOT fast enough to meet that performance target. No amount of GPU power or resolution changes will change that outcome.
This is SUPER useful for people who are picky about performance and have specific performance targets in mind. It's also especially useful for people who have longevity in mind. If this CPU barely meets your performance target today, it probably won't anymore in 2 years.
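In other words, the low-res 1% low acts as a hard ceiling; a tiny sketch of that check, with illustrative numbers:

```python
# The CPU's low-res 1% low is the best it can ever do, so it immediately
# rules a chip in or out for a given frame-rate target. Numbers are made up.
def cpu_can_hit_target(low_res_1pct_low, target_fps):
    return low_res_1pct_low >= target_fps

print(cpu_can_hit_target(100, 144))  # False: no GPU or resolution change will fix this
print(cpu_can_hit_target(160, 144))  # True: fast enough, provided the GPU keeps up
```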
semitope@reddit
shh. go pay a few hundred more for fewer cores and no gains with the config you play at.
TheRealSeeThruHead@reddit
the video directly contradicts what you're saying
semitope@reddit
I'm not talking about the processors they have in that video. I don't know why they didn't have their usual broader lineup.
Nor does the video represent all scenarios.
GarbageFeline@reddit
And it's possible to get that understanding even from 1080p data, as it scales to higher resolutions. All of the benchmarks they showed with higher lows show similar scaling at 4K.
TheRealSeeThruHead@reddit
It's not possible or practical for the average person to extrapolate from 1080p to 4K numbers.
Every game scales differently.
And what we really want to know is actual fps on our setup with new hardware (without having to purchase it ourselves).
The % differences at 1080p never hold up at 4K. They're always a lot lower, but not so small that you can brush them off as meaningless.
No_Screen4750@reddit
Dumb take
lebrowski77@reddit
This benchmark wasn't really needed, as its results were very predictable from the earlier 1080p data. I'm much more interested in knowing how much the 3D V-Cache helps with stutter in unoptimised crap like Elden Ring and Unreal Engine games. If there are occasional 250ms stutters on the 7700X, can the 9800X3D bring them down to <50ms? If so, then it becomes worth the premium for me.
autumn-morning-2085@reddit
Think you will have to look for specific user reports for such data (as unreliable as they are). And shader-comp like stutters are hard to measure consistently. I doubt vcache will have much effect there, can't expect more than 25% improvement for something compute-bound.
GeographicPolitician@reddit
I agree with what he is saying. However, I think 1080p low (or even 720p) with ray tracing enabled should be included in CPU benchmarking, as ray tracing puts a very real demand on the CPU these days.
Also, I think a lot of these reviewers need to go back and do a separate CPU-testing video using the 7900 XTX or an Intel GPU (or whatever the best AMD/Intel GPU is at the time), since we may get different results in a lot of titles given how differently these CPUs handle feeding each vendor's GPU with frames. A lot of customers are committed to a certain GPU brand, so it would be good not to have to guess from Nvidia-only testing which CPU is best and whether that carries over to the brand you actually need.
I run Linux and I only run AMD for this reason.
Sylanthra@reddit
After watching this video, I want to see more 4K CPU testing, not less. It looks like modern games created for the PS5 are actually CPU-limited at 4K, unlike games made for the previous generation of consoles, so upgrading to a more powerful CPU can actually benefit you right now even if you're playing at 4K, as opposed to how it used to be, when it was purely an exercise in future-proofing.
glowtape@reddit
Eh, once the number of UE5-based games with Nanite assets ramps up, it might change things. Unreal Engine actually renders parts of Nanite assets on the CPU if the micro-triangles are small enough to become a bother for the GPU (see quad overdraw), and then merges that with the GPU render of the larger triangles in the scene. So at some point, the CPU will have some influence.
SJGucky@reddit
HWUB made 4 errors.
First, they used upscaling for 4K and not native.
Second, they used Balanced instead of Performance.
Third, they didn't use a 7800X3D or 5800X3D.
Fourth, they didn't use a 5090. :D :D :D
The gain at native would be almost zero, and the gain with Performance mode would have been higher.
FYI, I use a 5800X3D and a 4090 to play at 4K with DLSS Performance (1080p->4K).
And I know that 1080p is the closest test to my setup.
HWUB made the video just to throw out some ragebait. :D
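For reference, using the commonly cited DLSS scale factors, the internal render resolutions behind those 4K modes work out roughly like this:

```python
# Rough internal render heights for 4K output with the commonly cited
# DLSS scale factors (Quality ~66.7%, Balanced ~58%, Performance 50%).
output_height = 2160
scale = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

for mode, s in scale.items():
    print(f"4K DLSS {mode}: ~{round(output_height * s)}p internal")
# -> Quality ~1441p, Balanced ~1253p, Performance 1080p
# So 4K Performance really is a 1080p render being upscaled.
```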
fLp__@reddit
Yeah, I get why CPUs are tested this way; my problem is that the questions I have are still unanswered.
AnthMosk@reddit
As posted days ago: https://www.reddit.com/r/pcgaming/s/FLzaRinTIW
tsibosp@reddit
I'm in the process of building a new PC and I'll go with the 7600X for the time being, just waiting for the 5080 to launch. Gonna be gaming at 4K and 1440p on an LG G4.
I don't see any reason to drop another 300€ on the CPU plus the extra €€€ for a better mobo.
If I feel like it's not enough, I can always upgrade to a 9800X3D or something in 2-3 years.
semitope@reddit
Did they give a reason for only using the 7700X and the 285K, which still has performance issues? Usually those aren't the CPUs it's on par with at 4K.
Noble00_@reddit
☠️ There is no way the PC hardware community went from the "Zen 5%" and "Arrow Lake 2.85%" jokes about gaming performance uplifts to now having an issue with benchmarking CPUs at 720p/1080p in 2024.
Courtesy of u/Voodoo2-SLi and his meta reviews: going from Zen 4 to Zen 5 we got a 9% uplift in applications, a 4% uplift in gaming, and 30% / 3% better energy efficiency for applications and gaming respectively.
Going from Raptor Lake to Arrow Lake we got a 5% uplift in applications, a 6% regression in gaming (pre-patch, if that means anything in the future), and 40% / 50% better energy efficiency for applications and gaming respectively.
/s Guys, just buy a 5600 or 12400F and pair it with a 4090, or buy an M4 Mac if you truly are disappointed with this gen.
I also want to bring up Steve's other videos to help clarify. Here is the Ryzen 5 3600 vs the best CPUs in 2020, the Ryzen 9 3950X and Intel i9-10900K, paired with a 3080 at 4K; in that 15-game average there wasn't much, if any, difference.
Three years later, if you stuck with the same 3600 or 10900K (well, 9900K in the video) BUT paired it with a 4090, look at how much better a V-Cache Ryzen, the 5800X3D, performs at 4K.
I think Steve said it best in this video (6 months ago).
Here is also r/hardware messiah, Steve from Gamers Nexus on the matter (in the pinned comment section):
There are a lot of ways to approach reviews. We view bottleneck testing as a separate content piece or follow-up, as it also starts getting into territory of functionally producing a GPU benchmark.
What matters is a consistent philosophy: Our primary philosophy is to isolate components as much as possible, then as standalone or separate feature pieces, we run 'combined' tests that mix variables in ways we wouldn't for a standardized review. For us, reviews are standardized, meaning all parts (more or less) follow the same test practices. Introducing more judgment calls introduces more room for inconsistency in human decision making, so we try to avoid these wherever possible to keep comparisons fair. Choosing those practices is based upon ensuring we can show the biggest differences in components with reasonably likely workloads.
A few things to remember with benchmarks that are borderline GPU-bound:
- You can no longer fully isolate how much of the performance behavior is due to the CPU, which can obscure or completely hide issues. These issues include: poor frametime pacing, inconsistent frametime delivery, in-game simulation time error due to a low-end CPU dropping animation consistency despite good frame pacing, and overall quality of the experience. This is not only because it becomes more difficult to isolate if issues such as micro stutters are caused by the CPU or GPU, but also because the limitation may completely sidestep major issues with a CPU. One example would be Total War: Warhammer 3, which has a known and specific issue with scheduling on high thread count Intel CPUs in particular. This issue can be hidden or minimized by a heavy GPU bind, and so 4K / Ultra testing would potentially mean we miss a major problem that would directly impact user experience.
- Drawing upon this: We don't test for the experience in only that game, but we use it as a representative of potentially dozens of games that could have that behavior. In the same example, we want that indicator of performance for these reasons: (1) If a user actually does just play in a CPU bind for that game, they need to know that even high-end parts can perform poorly if CPU-bound; (2) if, in the future, a new GPU launches that shifts the bind back to the CPU, which is likely, we need to be aware of that in the original review so that consumers can plan for their build 2-3 years in the future and not feel burned by a purchase; (3) if the game may represent behavior in other games, it is important to surface a behavior to begin the conversation and search for more or deeper problems. It's not possible to test every single game -- although HUB certainly tries -- and so using fully CPU-bound results as an analog to a wider gaming subset means we know what to investigate, whereas a GPU bind may totally hide that (or may surface GPU issues, which are erroneously attributed to the CPU).
One thing to also remember with modern 1080p testing is that it also represents some situations for DLSS, FSR, or XeSS usage at "4K" (upscaled).
A great example of all of this is to look at common parts from 4-5 years ago, then see how they have diverged with time. If we had been GPU-bound, we'd have never known what that divergence might be.
Finally: One of the major challenges with GPU-bound benchmarks in a CPU review is that the more variable ceiling caused by intermittent GPU 'overload' means CPU results will rarely stack up in the hierarchy most people expect. This requires additional explanation to ensure responsible use of the data, as it wouldn't be odd to have a "better" CPU (by hierarchy) below a "worse" CPU if both are externally bound.
We still think that high resolution testing is useful for separate deep dives or in GPU bottleneck or GPU review content.
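To make the "GPU bind hides CPU differences" point concrete, here's a toy per-frame model (all numbers invented): each frame takes roughly the longer of the CPU and GPU frame times, so once the GPU is the slower part, two quite different CPUs land on nearly identical results.

```python
# Toy model of why a GPU bind compresses CPU differences: each frame takes
# roughly max(cpu_time, gpu_time). All numbers are invented for illustration.

def avg_fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu, slow_cpu = 5.0, 8.0        # 200 fps vs 125 fps CPU-bound ceilings
light_gpu, heavy_gpu = 4.0, 14.0     # 1080p low vs 4K ultra GPU frame cost

print(avg_fps(fast_cpu, light_gpu), avg_fps(slow_cpu, light_gpu))   # 200 vs 125 -> gap is visible
print(avg_fps(fast_cpu, heavy_gpu), avg_fps(slow_cpu, heavy_gpu))   # ~71 vs ~71 -> gap vanishes
```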
Snobby_Grifter@reddit
Using a 4090 with 4K upscaling is not the same as using a 7800 XT or 4070 Super. The GPU usage on a 4090 is simply low enough to show proper CPU scaling. Even for the 1% lows to be better, you need to be clear of a GPU limit.
This is a tale as old as time. Mid-range GPUs pair great with mid-range CPUs, unless you're running your mid-range GPU at low resolution; then go all out on your CPU. You should know whether your titles are CPU-bound or GPU-bound, and by how much.
smirkjuice@reddit
they're running out of numbers. Imagine presenting the "AMD Ryzen 9 10955X3D"
KirillNek0@reddit
Yeah-yeah.... Whatever, HU.
dwausa@reddit
Took the time to show the 5800x3d, didn’t bother to put the 9800x3d in the same chart….
FitCress7497@reddit
Lmao, 4K Balanced, what's the diff from 1080p with that? It's like 1300p. At least use Quality.
martylardy@reddit
AMD cry boys crying at 1000 fps at 1080p low settings